The Modern Industrial Templates For "Best First Search"
This topic contains 3 replies, has 1 voice, and was last updated by
Josh Stern November 26, 2022 at 5:12 pm.
November 26, 2022 at 6:21 am #124991

Josh Stern (Moderator)
Elements of the graph structure itself may also be dynamically updated in some applications. Where it is viable, metaprogramming allows threading guards to be used only where they are actually needed.
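A minimal sketch of that idea (all names here are hypothetical, not taken from any particular library): a decorator that installs a lock around a graph-mutation method only when the graph is actually shared across threads, so single-threaded use pays no synchronization cost at all.

```python
import threading
from functools import wraps

def maybe_synchronized(thread_safe):
    """Wrap a function with a lock only when thread_safe is True.

    When the structure is used from a single thread, the wrapper is
    the original function -- zero locking overhead.
    """
    def decorate(fn):
        if not thread_safe:
            return fn  # no guard needed
        lock = threading.Lock()

        @wraps(fn)
        def guarded(*args, **kwargs):
            with lock:
                return fn(*args, **kwargs)
        return guarded
    return decorate

# Hypothetical dynamic graph: edges may be added while a search runs.
class DynamicGraph:
    def __init__(self, shared=False):
        self.adj = {}
        # The guard is chosen once, at construction time.
        self.add_edge = maybe_synchronized(shared)(self._add_edge)

    def _add_edge(self, u, v):
        self.adj.setdefault(u, set()).add(v)

g = DynamicGraph(shared=True)
g.add_edge("a", "b")
```

With `shared=False` the bound method is returned untouched, which is the point: the guard exists only where the application needs it.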
November 26, 2022 at 12:20 pm #125017

Josh Stern (Moderator)
More food for example & feature thoughts:
If an optimization problem has geometric regions satisfying Lipschitz conditions, then a graph can be built from an epsilon-cover of the space. This guarantees finding a global optimum, while learned or heuristic features can speed the location of the optimal region, allowing some other evaluations to be elided.
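A minimal one-dimensional sketch of the epsilon-cover guarantee (function and constants chosen purely for illustration): if f is L-Lipschitz on [lo, hi], a grid with spacing 2ε/L ensures the best grid point is within ε of the global minimum, because every point of the interval is within ε/L of some grid point.

```python
import math

def epsilon_cover_minimize(f, lo, hi, lipschitz, eps):
    """Minimize f on [lo, hi] by evaluating an epsilon-cover.

    If |f(x) - f(y)| <= lipschitz * |x - y|, a grid with spacing
    2*eps/lipschitz guarantees the best grid value is within eps
    of the true global minimum.
    """
    step = 2.0 * eps / lipschitz
    n = int(math.ceil((hi - lo) / step))
    best_x, best_v = lo, f(lo)
    for i in range(1, n + 1):
        x = min(lo + i * step, hi)  # clamp the last point to hi
        v = f(x)
        if v < best_v:
            best_x, best_v = x, v
    return best_x, best_v

# Example: f(x) = |x - 0.3| is 1-Lipschitz; its global minimum is 0.
x, v = epsilon_cover_minimize(lambda x: abs(x - 0.3), 0.0, 1.0,
                              lipschitz=1.0, eps=0.01)
```

The cover is the brute-force baseline; the learned features mentioned above would be used to visit promising grid cells first and elide evaluations the Lipschitz bound already rules out.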
If a region is not geometric with such conditions, what gives rise to search pruning? Perhaps probabilities based on statistical sampling of marginals, where a model built from lower-order marginal statistics is deemed a good fit. Arriving at that point of view may take many example problems, so the model could be a hot-pluggable component in a very long-running routine.
The model of environmental boxes that I suggest for language semantics can also be used for case-based reasoning, to decide whether a case is really in the same region of operation or in an entirely different one with different probability models. Much of that decision rests on accepted scientific theories rather than on case-based experience alone.
In general, we expect to find many more valuable applications for “solvers” if we can fit them into smarter, faster, more robust, reusable, & scalable harnesses.
November 26, 2022 at 5:12 pm #125020

Josh Stern (Moderator)
In the general case, Best First Search includes "Bandit Problems," where the formal formulation assigns a material value to information about the problem gained through the process of search and experiment. The library options supporting the "lambda" that computes best-first priority can helpfully support further parameterization within an internal structure that reflects intelligent practice on such problems.
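For the bandit flavor, UCB1 is one standard way to price the information gained by each pull: the priority is the empirical mean plus an exploration bonus that shrinks as an arm is sampled. A self-contained sketch (the arm definitions are made up for illustration; this is not from any particular solver library):

```python
import math
import random

def ucb1(arms, pulls, seed=0):
    """Run UCB1: pick the arm maximizing mean + sqrt(2 ln t / n).

    The bonus term is the 'value of information': rarely pulled
    arms get an exploration boost that decays as they are sampled.
    """
    rng = random.Random(seed)
    counts = [0] * len(arms)
    sums = [0.0] * len(arms)
    for t in range(1, pulls + 1):
        if t <= len(arms):
            i = t - 1  # pull each arm once to initialize
        else:
            i = max(range(len(arms)),
                    key=lambda k: sums[k] / counts[k]
                    + math.sqrt(2.0 * math.log(t) / counts[k]))
        counts[i] += 1
        sums[i] += arms[i](rng)
    return counts, sums

# Two Bernoulli arms with success probabilities 0.3 and 0.7.
arms = [lambda r: float(r.random() < 0.3),
        lambda r: float(r.random() < 0.7)]
counts, sums = ucb1(arms, 2000)
```

The same mean-plus-bonus shape is a plausible template for the best-first "lambda": the internal structure parameterizes how the bonus is computed, which is exactly where intelligent practice about bandits would live.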
What is the structure? This is a truly interesting spot for theoreticians to focus on. How should it be decomposed? I want to include concepts of annealing & diversity search & feasible regions & satisficing.
The most natural ways of describing those bits of knowledge (initial knowledge, learning gained about the problem class from previous trials of like problems, & dynamic learning in the current run) will probably require some "dynamic compilation" to fit into fast executable form. Focus on that. Abstractly we have:
CurrentProblemProgressDesc (includes rate of solution improvement, feasibleFound?, degree of config space ‘covered’, relation to solution class…)
FinalSolutionMetrics
LastCompiledVersion – bytecodeish, time of doing that…
InfoAboutTheGraphBoundaryWeSearch
HowWeHashThisForMachineLearningData …
Other…
Work on that structure for AI smarts we can use.
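One way the abstract record above might be sketched as a data structure (every field name here is a hypothetical rendering of the items listed, not a fixed API):

```python
import time
from dataclasses import dataclass, field

@dataclass
class SearchRunState:
    """Hypothetical rendering of the abstract record sketched above."""
    # CurrentProblemProgressDesc
    improvement_rate: float = 0.0      # rate of solution improvement
    feasible_found: bool = False       # feasibleFound?
    config_space_covered: float = 0.0  # degree of config space 'covered'
    # FinalSolutionMetrics
    final_metrics: dict = field(default_factory=dict)
    # LastCompiledVersion -- "bytecodeish", plus time of doing that
    compiled_blob: bytes = b""
    compiled_at: float = 0.0
    # InfoAboutTheGraphBoundaryWeSearch
    frontier_info: dict = field(default_factory=dict)

    def fingerprint(self) -> int:
        """HowWeHashThisForMachineLearningData: a stable key under
        which runs of like problems can be grouped as training data."""
        return hash((round(self.improvement_rate, 6),
                     self.feasible_found,
                     round(self.config_space_covered, 6)))

s = SearchRunState(feasible_found=True, compiled_at=time.time())
```

The "Other…" slot is left open on purpose; the point of the decomposition is that the dynamically compiled priority function reads this record, while the machine-learning side keys on `fingerprint()`.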