I was wondering today how different programming and natural languages really are, and whether one day someone could write applications in a natural language - maybe even just speak to a computer like you would to a human, have it understand what it needs to do, and have it recognise for itself the efficient methods for approaching the problems it's given and use them.
This raised two sets of questions for me. They don't exactly flow together; they're very much separate questions.
The first is: if we define "programming" as "provid[ing] (a computer or other machine) with coded instructions for the automatic performance of a task" (thanks, Google), would telling your phone to set an alarm qualify as "programming"? This can certainly be done by voice command now... and it meets the definition - you're providing a computer or other machine, in this case your phone, with instructions (coded in a natural language of your choice) that follow a syntax it can recognise (e.g. "Set an alarm for _ o'clock"), so that it later automatically performs the task of ringing/vibrating/setting off fireworks under your bed/yodelling/whatever. You've just programmed your phone to set off an alarm - doesn't that make the natural language you just used a programming language too? Would that make all natural languages programming languages? If you're willing to stretch the "other machine" part of that definition to include humans, natural languages would definitely qualify - "instructing" would become almost synonymous with "programming". I think someone's already had a dig at this, and maybe that's how the esoteric language Chef happened?
The next is something I came across when people were comparing natural and programming languages - error tolerance. Most natural languages have a high error tolerance: yuo hfal cna dpro snetenesce, rwod mujble nda lstli nudsterood. With programming languages, error tolerance is typically pretty strict - misspell a variable name or omit a curly brace and you're out of luck... but this made me wonder: would it be possible to write a compiler or interpreter with error tolerance like that in the sentence I hope you understood above? In Go you can omit variable types at declaration by using "short variable declarations", where the type of the variable is inferred from what's on the right-hand side of the declaration (see the sketch below) - maybe like how in conversation I could some words and you'd still be to what I meant (inb4 grammar nazi, I omitted some words there to prove the point).
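For anyone who hasn't seen it, here's a minimal sketch of what that inference looks like - the variable names and values are just made up for illustration:

```go
package main

import "fmt"

func main() {
    // Short variable declarations: the compiler infers each type
    // from the right-hand side, so no explicit type is written.
    name := "alarm" // inferred as string
    hour := 7       // inferred as int
    ratio := 0.5    // inferred as float64
    active := true  // inferred as bool

    fmt.Println(name, hour, ratio, active)
}
```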
Would it be that much of a stretch to write a plugin for something like Visual Studio Code that at least suggests what you might have meant to type when you misspell a vrabiael nmae, or suggests semicolons where you've put a newline but forgotten one? Is this something people have thought about before but dismissed out of fear that it'd cause more problems than it solves - like it assuming you've omitted something by accident that was in fact deliberate?
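As it happens, the semicolon half of this already exists in a sense: Go's lexer automatically inserts semicolons at the ends of most lines, and JavaScript's "automatic semicolon insertion" is infamous for occasionally guessing wrong - exactly the deliberate-vs-accidental fear above. The misspelling half is usually approached with edit distance. Here's a hypothetical sketch of the suggestion logic such a plugin might use - the identifier list, the misspelling, and the maxDist cutoff are all made up for illustration:

```go
package main

import "fmt"

// levenshtein returns the edit distance between two strings: the
// minimum number of single-character insertions, deletions, and
// substitutions needed to turn a into b.
func levenshtein(a, b string) int {
    prev := make([]int, len(b)+1)
    curr := make([]int, len(b)+1)
    for j := range prev {
        prev[j] = j
    }
    for i := 1; i <= len(a); i++ {
        curr[0] = i
        for j := 1; j <= len(b); j++ {
            cost := 1
            if a[i-1] == b[j-1] {
                cost = 0
            }
            curr[j] = min(prev[j]+1, curr[j-1]+1, prev[j-1]+cost)
        }
        prev, curr = curr, prev
    }
    return prev[len(b)]
}

// suggest picks the known identifier closest to the misspelling,
// but only if it's within maxDist edits - beyond that the "typo"
// might be a deliberate new name, so it stays quiet.
func suggest(misspelt string, known []string, maxDist int) (string, bool) {
    best, bestDist := "", maxDist+1
    for _, id := range known {
        if d := levenshtein(misspelt, id); d < bestDist {
            best, bestDist = id, d
        }
    }
    return best, bestDist <= maxDist
}

func main() {
    // Hypothetical identifiers already declared in the file.
    known := []string{"variableName", "userCount", "alarmTime"}
    if s, ok := suggest("vrabiaelNmae", known, 6); ok {
        fmt.Printf("did you mean %q?\n", s)
    }
}
```

The maxDist cutoff is the interesting design knob here: set it too generously and the tool "corrects" names you meant to introduce; set it too strictly and it stays silent on genuine typos - which is basically the trade-off you're describing.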