AI-based Literate Programming Modules for Edu SW Students
This topic contains 3 replies, has 1 voice, and was last updated by josh on July 16, 2021 at 6:48 pm.
July 16, 2021 at 6:17 pm #97176

josh
Q: Where will the NLP interface add long-term power, in addition to teaching newbies?
A: There’s more than one type of answer.
The original Knuth point about literate programming still holds – people devote different amounts of time to different projects & may come back to something after a year or two & struggle to remember the original logic and representation. Using formats that are close to natural language helps with that, & with new people picking up where someone else left off.
In the Expert System context – e.g. getting a robot to behave well in a medical or household setting – there is a huge amount of context. We need Expert AI/CASE tools to form the disjunctive union of old work & new work & then compile it, checking for contradictions & running semantic unit tests. The expert programming interface shouldn’t be a mind-numbing scan of an incredibly long list of declarations in a mixed, dense format. Parsimonious expression is never going to be the productive work interface at that level – it’s always going to be “dvi” or something else in the pipeline.
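The "union old work & new work, then check for contradictions" idea can be made concrete. Here's a minimal sketch, with all names (the fact keys, the `merge_and_check` helper) being illustrative assumptions rather than any existing tool:

```python
# Hypothetical sketch: merge an old and a new set of expert-system
# declarations (as flat {name: value} facts) and flag direct
# contradictions before anything gets "compiled".

def merge_and_check(old_facts, new_facts):
    """Union two fact dicts and report (name, old, new) conflicts."""
    contradictions = []
    merged = dict(old_facts)
    for name, value in new_facts.items():
        if name in merged and merged[name] != value:
            contradictions.append((name, merged[name], value))
        merged[name] = value  # newer work overrides older on conflict
    return merged, contradictions

old = {"robot.max_speed": 1.0, "patient.contact": "gentle"}
new = {"robot.max_speed": 2.5, "room.lighting": "dim"}
merged, conflicts = merge_and_check(old, new)
```

A real expert-system merge would reason over structured rules, not flat keys, but even this toy version shows where semantic unit tests would hook in: on the `conflicts` list, before compilation.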
My intent is not to expect rapid convergence on a relatively fixed set of metalanguages. Experience & research will show where new types of power are needed. The NLP user may end up saying things that affect some architectural configuration in the software. I don’t want to start with the assumption that the pro interface to software architecture will end up looking like some existing language of today, as the system continues to evolve.
If you do a Google search on NLP now, it returns lots of links for neuro-linguistic programming. I have a lot of involuntary experience with “mind reading” here, & progress is being made. It is likely that forms other than typing will become increasingly important as methods of development. In that case, the input level can’t be ASCII and isn’t likely to be a straight text translation – pictures etc. will be used, along with processed vectors of dynamic brain state.
The domain expert who is not intending to become a programming jock is also a key beneficiary of some of this work.
Put all those things together.
July 16, 2021 at 6:32 pm #97177

josh
My intuition about the way people like to talk about semantic design says that the NLP workflow interface will support statements about modifying instances of understood “objects”. That doesn’t mean the deeper system is really using prototype-based programming in its run-time operations, or even in the code that’s most similar to high-level source code. It’s more like a level of macro at the designer level, plus AI logic about the similarity of parallel operations – e.g. I created a penguin character & now I want to create a walrus, inside the game system, using a knowledge base about penguins & walruses & the knowledge of what went into creating the penguin.
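The penguin-to-walrus idea can be sketched as designer-level macro reuse: replay the recorded steps that built one character, overriding parameters the knowledge base knows differ. Everything below – the recipe format, the `KNOWLEDGE` dict, `derive_recipe` – is a hypothetical illustration, not any real game system's API:

```python
# Hypothetical sketch of designer-level "prototype" derivation: reuse
# the recorded build steps for one character to draft a similar one,
# guided by a tiny knowledge base of differences.

PENGUIN_RECIPE = [
    ("add_body", {"shape": "torpedo", "cover": "feathers"}),
    ("add_limbs", {"kind": "flippers", "count": 2}),
    ("set_gait", {"style": "waddle"}),
]

KNOWLEDGE = {
    # facts the system knows differ between the two animals
    "walrus": {"cover": "skin", "style": "drag"},
}

def derive_recipe(base_recipe, animal):
    """Copy a recipe, overriding parameters the knowledge base overrides."""
    facts = KNOWLEDGE[animal]
    derived = []
    for step, params in base_recipe:
        new_params = {k: facts.get(k, v) for k, v in params.items()}
        derived.append((step, new_params))
    return derived

walrus = derive_recipe(PENGUIN_RECIPE, "walrus")
```

The point of the sketch is where the logic lives: not in the runtime object model, but in a macro layer over the record of what went into creating the penguin.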
July 16, 2021 at 6:48 pm #97178

josh
Q: The great system for defining the common structure of animal characters in a general-purpose VR framework, blah, blah, might be someone’s entire master’s thesis in software engineering. How can it be part of a casual NLP interface?
A: There isn’t support for including all linguistic concepts of common knowledge. There’s context-smart support for mapping words to entries in specific knowledge databases (which databases to use is a hidden parameter), plus basic-level generalization of operations between the parallel database entries & the literate programming representation. Perhaps the first path between the two is all some sort of coding at the logic level. Perhaps the second try can reuse the original map, but a “bug” in the original logic/mapping is discovered in the process of looking at it. Where are the algorithm & the bug? Not at the source code level – in the meta-programming level of the NLP-using “IDE”. Maybe someone will come along & do a master’s thesis on adding tag features to the knowledge base & the literate representations, with the result that there is a first prototype path for the mapping. That may also turn out to have bugs or need adjustment on inspection. Conceptually, that is a new Wild West. I’m not advocating going crazy. I’m saying the right shape isn’t just 1 or 2 steps from design language to conventional source code. A metaprogramming environment with NLP is relevant.
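To make "mapping words to entries in specific knowledge databases, with the database choice as a hidden parameter" concrete, here's a minimal sketch; the KB names, URI scheme, and `resolve` function are all assumptions for illustration:

```python
# Hypothetical sketch: context-aware mapping from a designer's word to
# an entry in one specific knowledge database. Which database is active
# is a hidden parameter of the session, not something the user types.

KB_ANIMALS = {"penguin": "kb://animals/penguin",
              "walrus": "kb://animals/walrus"}
KB_FURNITURE = {"table": "kb://furniture/table"}

SESSION_CONTEXT = {"active_kb": KB_ANIMALS}  # the hidden parameter

def resolve(word, context=SESSION_CONTEXT):
    """Map a natural-language word to a KB entry, or None if unmapped."""
    return context["active_kb"].get(word.lower())

entry = resolve("Penguin")
```

The "bug in the original mapping" the post describes would live exactly here – in the resolution and generalization layer – rather than in any conventional source file, which is why fixing it is a metaprogramming-level activity.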