A fairly surreal and probably overly optimistic example would be to solve traveling salesman problems with chess engines. What we would need is to find the right mappings: (1) from a traveling salesman problem to a chess position, and (2) from a traveling salesman route to a chess move (or move sequence).
A general solution for a "compiler" that can translate between any pair of problems feels unrealistic, but I can imagine developing a mapping between, say, a tic-tac-toe game and simple chess positions where you could: (1) translate a tic-tac-toe position into a chess position, (2) solve the chess position, and (3) translate the solution into a tic-tac-toe sequence.
Any thoughts or pointers to relevant research would be much appreciated!
“The easiest way to prove that some new problem is NP-complete is first to prove that it is in NP, and then to reduce some known NP-complete problem to it”
As a second example, in combinatorial game theory, the Sprague–Grundy theorem states
“every impartial game under the normal play convention is equivalent to a one-heap game of nim, or to an infinite generalization of nim. It can therefore be represented as a natural number, the size of the heap in its equivalent game of nim, as an ordinal number in the infinite generalization”
(https://en.wikipedia.org/wiki/Sprague–Grundy_theorem)
That means that, presented with an impartial game (one in which both players have the same moves available, which rules out chess, since White can't move Black's pieces and vice versa) under the normal play convention (the last player able to make a move wins), mathematicians look for a way to translate game positions to nimbers (https://en.wikipedia.org/wiki/Nimber) in order to learn how to play them.
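As a toy illustration of the theorem (my own sketch, not from the article): take the subtraction game where a move removes 1, 2 or 3 tokens from a heap. The Grundy number (nimber) of a position is the mex, the minimum excludant, of the Grundy numbers of the positions you can move to, and a position is a loss for the player to move exactly when its nimber is 0.

```python
from functools import lru_cache

# Subtraction game: a heap of n tokens, a move removes 1, 2 or 3 tokens.
# Last player able to move wins (normal play convention).
MOVES = (1, 2, 3)

def mex(s):
    """Minimum excludant: the smallest non-negative integer not in s."""
    n = 0
    while n in s:
        n += 1
    return n

@lru_cache(maxsize=None)
def grundy(n):
    """Grundy number (nimber) of a heap of size n."""
    return mex({grundy(n - m) for m in MOVES if m <= n})

# Positions with nimber 0 are losses for the player to move.
values = [grundy(n) for n in range(8)]  # [0, 1, 2, 3, 0, 1, 2, 3]
```

The pattern n mod 4 that emerges is the "translation to nim" the theorem promises: a heap of n tokens in this game plays exactly like a nim heap of size n mod 4.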
Doing such mappings, if not trivial, requires creativity, so I think research on the subject would be in the psychology department.
Polya, in “How to Solve it” has some discussion on this (https://en.wikipedia.org/wiki/How_to_Solve_It#Heuristics)
https://en.wikipedia.org/wiki/Reduction_(complexity)
In practice, there are many cases where people use SAT solvers for other problems. For some examples:
A small excerpt: "… We have to look around for closely related problems; we look at the unknown, or we look for a formerly solved problem which is linked to our present one by Generalization, Specialization, or Analogy"
A simple example of this: all even integers are divisible by two. That's two sets: the set of even numbers and the set of numbers divisible by two. Any even number x is equal to 2k for some integer k, which implies that x/2 = 2k/2 = k. Since x/2 = k is an integer, x is divisible by 2. QED.
It is also possible to prove that if an integer is divisible by 2, it is even. That's a different proof.
The fact that you can solve problem A in terms of problem B doesn't always mean you can solve problem B in terms of problem A. Just because great minds think alike, doesn't mean people who think alike are great minds.
All reductions of one problem to another, in the technical sense, are built on such foundations.
The specific area you're thinking of is theoretical computer science.
You might like the textbook "Introduction to the Theory of Computation" by Sipser. It starts by showing the mutual reducibility of regular languages (a.k.a. the languages of deterministic finite automata / regular expressions) and moves step by step up to Turing machines, which support general-purpose programming languages, and beyond (languages that can only be computed by hypothetical machines we don't know how to build in reality).
MIT 6.890 https://ocw.mit.edu/courses/electrical-engineering-and-compu...
The basic idea is to map a hard problem A (e.g. TSP) to some other problem X (e.g. chess) by finding "gadgets"; then you know X is at least as hard as A (a lower bound on X).
No one knows how to solve NP-complete problems efficiently, but SAT (and 3SAT) solvers are basically the best practical attack on that complexity class.
-----------
The constraint satisfaction problem is another one that is sometimes used.
For example, Sudoku, traveling salesman, coloring problems and more all reduce to 3SAT. A dedicated traveling-salesman solver will be faster than pretty much any general-purpose 3SAT solver, but it's still easier to use someone else's solver than to write your own.
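To make "reduce to SAT" concrete, here's a toy sketch of my own (the encoding and the brute-force "solver" are illustrative, not how a real SAT solver works): graph 3-coloring turned into CNF clauses, where variable (v, c) means "vertex v gets color c".

```python
from itertools import product

# Toy reduction: graph 3-coloring -> CNF-SAT, solved by brute force.
# A clause is a list of (variable, polarity) literals; it is satisfied
# when at least one literal matches the assignment.

def coloring_to_cnf(vertices, edges, colors=3):
    clauses = []
    for v in vertices:
        # v gets at least one color
        clauses.append([((v, c), True) for c in range(colors)])
        # v gets at most one color
        for c1 in range(colors):
            for c2 in range(c1 + 1, colors):
                clauses.append([((v, c1), False), ((v, c2), False)])
    for u, v in edges:
        # adjacent vertices never share a color
        for c in range(colors):
            clauses.append([((u, c), False), ((v, c), False)])
    return clauses

def brute_force_sat(clauses):
    """Try every assignment; fine for toy instances only."""
    variables = sorted({lit for clause in clauses for lit, _ in clause})
    for bits in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, bits))
        if all(any(assignment[lit] == pol for lit, pol in clause)
               for clause in clauses):
            return assignment
    return None

triangle = coloring_to_cnf(['a', 'b', 'c'], [('a', 'b'), ('b', 'c'), ('a', 'c')])
model = brute_force_sat(triangle)  # a triangle is 3-colorable, so this is not None
```

In practice you'd emit the same clauses in DIMACS format and hand them to a real solver; the point is only that the translation step is mechanical.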
---------
An intriguing "simpler" problem is maximum flow, which is surprisingly flexible and shows up in many, many algorithms. It's not as widely applicable as 3SAT, but maximum flow is "efficient" to solve (polynomial time and polynomial space).
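For reference, one of the standard polynomial-time algorithms (Edmonds-Karp) fits in a few lines; this is a minimal sketch of my own, with capacities as a dict-of-dicts:

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: repeatedly push flow along shortest augmenting
    paths found by BFS. Runs in O(V * E^2)."""
    # Build residual capacities, adding zero-capacity reverse edges.
    residual = {u: dict(edges) for u, edges in capacity.items()}
    for u, edges in capacity.items():
        for v in edges:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # BFS for an augmenting path with spare capacity.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Find the bottleneck along the path, then update residuals.
        path = []
        v = sink
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= bottleneck
            residual[v][u] += bottleneck
        flow += bottleneck

graph = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}}
print(max_flow(graph, 's', 't'))  # 5
```

Bipartite matching, image segmentation and many scheduling problems reduce to exactly this routine, which is what makes it such a useful target.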
---------
A good introduction to this (including these "canonical" problems, like maximum flow and 3SAT) is "Algorithm Design" by Jon Kleinberg and Éva Tardos. It only covers the basics, of course; 3SAT and constraint satisfaction are fields in their own right, with essentially their own branches of mathematics.
A lot of graph algorithms are also worth knowing. Many, many problems map onto graph algorithms (max-clique, topological sort, minimum spanning tree, etc.).
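As one example of that mapping, "in what order do I run tasks with dependencies?" is just topological sort. A minimal sketch of Kahn's algorithm (the task names are made up for illustration):

```python
from collections import deque

def topological_order(deps):
    """Kahn's algorithm. `deps` maps each task to the tasks it depends
    on; returns an order where every task comes after its dependencies,
    or None if the dependencies contain a cycle."""
    # in-degree = number of unmet dependencies
    indegree = {t: len(d) for t, d in deps.items()}
    dependents = {t: [] for t in deps}
    for t, d in deps.items():
        for prereq in d:
            dependents[prereq].append(t)
    ready = deque(t for t, n in indegree.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for nxt in dependents[t]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    # If a cycle remains, some tasks never reach in-degree 0.
    return order if len(order) == len(deps) else None

build = {'compile': [], 'test': ['compile'], 'package': ['compile'],
         'deploy': ['test', 'package']}
print(topological_order(build))
```

Build systems, spreadsheet recalculation and course scheduling all reduce to this one routine.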
> A general solution for a "compiler" that can translate between any pair of problems feels unrealistic, but I can imagine developing a mapping between, say, a tic-tac-toe game and simple chess positions where you could: (1) translate a tic-tac-toe position into a chess position, (2) solve the chess position, and (3) translate the solution into a tic-tac-toe sequence.
Yup, that's what a 3SAT solver gives you. It will solve the problem, eventually, but 3SAT is NP-complete, so the heat death of the universe may arrive before the answer comes out.
If you know your problem is less complex than NP-complete, you'll want to map it to some other problem of lower complexity (so that the algorithm finishes faster).
Just wanted to share for anyone interested that there is in depth research and theory developed in cognitive science concerning the way people use what the field calls conceptual blending to make sense of unfamiliar subjects with familiar concepts. Maybe it's worth taking inspiration from!
Research study observing the effectiveness of this ability in humans (pdf):
https://deepblue.lib.umich.edu/bitstream/handle/2027.42/2533...
Book on the wider subject:
https://www.amazon.com/Way-We-Think-Conceptual-Complexities/...
https://en.m.wikipedia.org/wiki/Category_theory
https://open.spotify.com/episode/6v01kNIPZZTmQk483nFy3H?si=8...
All over mathematics and problem solving we frequently find many techniques of converting the problem we are trying to solve into something easier. I'd even argue that this is what mathematics and most of science is in general. After all, everything we are doing is an approximation. We can't solve the universe, but we can dictate what we see with a carefully laid out language (physics) and use that to describe what we see and make predictions based on it. Everything is just a model and a model is a map. (I'd go as far as arguing that we do this with language, not just in analogies, but in so far as saying that the existence of language itself is a map from a difficult and intractable problem to an easier one)
The hard part, though, is that not everything can be mapped to everything else. For example, there is no bijective mapping between tic-tac-toe and chess, and you can prove this by counting the game states each contains: a map from chess positions down to tic-tac-toe positions clearly cannot be injective, and one from tic-tac-toe up to chess cannot be surjective.
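The counting argument is easy to make concrete for the small side. A sketch of my own that enumerates every tic-tac-toe state reachable from the empty board (stopping once someone has won):

```python
def winner(b):
    """Return 'X' or 'O' if the board has a completed line, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
             (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]
    for i, j, k in lines:
        if b[i] != ' ' and b[i] == b[j] == b[k]:
            return b[i]
    return None

def reachable_states():
    """All board states reachable by legal play from the empty board."""
    start = tuple(' ' * 9)
    seen = {start}
    frontier = [start]
    while frontier:
        b = frontier.pop()
        if winner(b):
            continue  # game over: don't expand finished positions
        player = 'X' if b.count('X') == b.count('O') else 'O'
        for i in range(9):
            if b[i] == ' ':
                nb = b[:i] + (player,) + b[i + 1:]
                if nb not in seen:
                    seen.add(nb)
                    frontier.append(nb)
    return seen

states = reachable_states()
print(len(states))  # a few thousand, far below 3**9 = 19683 raw fillings
```

That count is minuscule next to the number of legal chess positions (commonly estimated around 10^44), which is the pigeonhole argument in executable form.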
So it is hard to point you toward specific research without more specifics. I would encourage you to see the world this way. This is why many of these fields encourage looking at problems through different lenses, and why many of the big breakthroughs come from connecting ideas across fields (Nash and Einstein are notable and relatively recent examples). There was Terence Tao's post about how to solve problems on HN just the other day (with many references to Polya's book), and you'll find the same recommendations there.
I don't know if there is any specific field that covers this topic but rather I think every field _is_ this topic.
[1] https://ieeexplore.ieee.org/iel5/32/5010265/05010283.pdf
Many problems can be reduced to SAT and then you can employ an off-the-shelf SAT/SMT solver to solve it for you. Etc etc.
But in general, being able to reduce problem A to problem B implies that A is not harder than B. I.e., to show that a problem P is undecidable, you can reduce the Halting problem to P.
TSP is NP-hard and can be reduced to SAT, then solved with a SAT solver. But something like determining whether a player has a winning strategy in generalized chess (on an nxn board, for arbitrary n) is EXPTIME-complete, so it provably cannot be solved in polynomial time and is not believed to reduce to any "easier" problem.
Edit: Someone ITT also mentioned ILP (integer linear programming). Also a good example. E.g., many optimization programs can be mapped to ILP.
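The cleanest textbook instance of "A reduces to B" is independent set vs. clique: a vertex set is independent in G exactly when it is a clique in G's complement, so a clique solver instantly solves independent set. A small self-checking sketch (my own code, vertices numbered 0..n-1):

```python
from itertools import combinations

# Classic reduction: INDEPENDENT-SET in G == CLIQUE in the complement of G.
# Complementing is an involution, so this reduction runs in both directions.

def complement(n, edges):
    """All non-edges of an n-vertex graph, as a set of frozensets."""
    return ({frozenset(e) for e in combinations(range(n), 2)}
            - {frozenset(e) for e in edges})

def is_clique(vertices, edges):
    edges = {frozenset(e) for e in edges}
    return all(frozenset((u, v)) in edges
               for u, v in combinations(sorted(vertices), 2))

def is_independent(vertices, edges):
    edges = {frozenset(e) for e in edges}
    return not any(frozenset((u, v)) in edges
                   for u, v in combinations(sorted(vertices), 2))

# 4-cycle 0-1-2-3-0: {0, 2} is independent in G, hence a clique in ~G.
n, edges = 4, [(0, 1), (1, 2), (2, 3), (3, 0)]
co = complement(n, edges)
```

Because the translation is a trivial graph transformation, the two problems have identical complexity, which is the "A is not harder than B" statement made concrete.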
https://en.m.wikipedia.org/wiki/Duality_(mathematics)
For example, in geometry, there is an equivalence between Voronoi diagrams, Delaunay triangulations, convex hulls and plane intersections. This is of very practical importance (for me at least!) because you can reuse non-trivial algorithms/libraries that solve one problem to solve the others. Duality is also very important in optimization, where you transform a problem, the primal, into its dual and solve one or both of them simultaneously.
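The optimization flavor of duality can be written down concretely. For a generic standard-form linear program (a textbook pair, not tied to any of the geometry examples above):

```latex
\begin{aligned}
\textbf{(P)}\quad & \min_{x}\ c^{\top}x
  &\qquad \textbf{(D)}\quad & \max_{y}\ b^{\top}y \\
& \text{s.t.}\ Ax \ge b,\ x \ge 0
  & & \text{s.t.}\ A^{\top}y \le c,\ y \ge 0
\end{aligned}
```

Weak duality gives $b^{\top}y \le c^{\top}x$ for any feasible pair, and for linear programs strong duality says the two optima coincide whenever one of them is finite, which is exactly why solving either problem tells you about the other.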
I have not found a lot of info on "pure" duality; it's a meta-concept present in a lot of different mathematical areas, but I'm sure there must be mathematicians looking at it.
In practice this almost always means translating to the formats used by (Mixed) Integer Programming (MIP) solvers, Constraint Programming (CP) solvers or SAT solvers.
Satisfiability Modulo Theories (SMT) solvers are used more in the formal verification realm, but they can also be used for the same purpose. The problems solved by all of these tools are NP-complete, so with more or less effort you can translate any NP problem to them.
“Understanding a thing is to arrive at a metaphor for that thing by substituting something more familiar to us. And the feeling of familiarity is the feeling of understanding.”
“Understanding in science is the feeling of similarity between complicated data and a familiar model.”
“The Bohr model of the atom is that of a proton surrounded by orbiting electrons. It is something like the pattern of the solar system, and that is indeed one of its metaphoric sources.”
This is an accessible paper on the general concepts:
https://arxiv.org/pdf/2101.11115.pdf
Abstract: “We solve complex problems by separating them into manageable parts [2,86]. Human designers do this intuitively, but details can quickly overwhelm intuition. Multiple aspects of a problem may lead to distinct decompositions and complementary models of a system– e.g. competing considerations for cyberphysical systems [63,87]–or simulation of behavior at many levels of fidelity–e.g. in modeling and simulation [88]–leading to a spectrum of models which are challenging to align. We argue that operads, formal tools developed to compose geometric and algebraic objects, are uniquely suited to separate complex systems into manageable parts and maintain alignment across complementary models”
John Baez has some inspiring work in this area as well.
Network Models is a good paper particularly on Network Operads
(By the way if anyone is looking for a Software Engineering Manager in New York City with a specialization in AI, see my profile and get in touch!)
It's possible that we could decompose complex problems into simpler ones and share the solutions to common subproblems, although in most cases I suspect the effects will differ at least slightly once you consider the n-order effects.
For example, a solution to inflation could be indexing a number so it adjusts automatically with the CPI. That might make sense for a Social Security payment and its COLA. It might not make sense for adjusting the basis of school property taxes, where other variables, like a decrease in students or an increase in funding from other sources, change the revenue need. The problems look like they map onto each other on the surface, but not when you dig deeper.
If you want to be even more general, check out a field of mathematics called category theory (https://en.wikipedia.org/wiki/Category_theory). Its central idea is to describe mathematical structures in terms of "objects" and the relationships between them (called morphisms). I'm not extremely knowledgeable in this subject, and I suspect most of what you get out of a category-theoretic treatment is too generic to be useful, but it has produced some real results in e.g. algebraic geometry.
This game is of course an imperfect-information game (you don't know your opponents' hole cards). We have lots of great algorithms for perfect-information games, but not for imperfect ones.
So what the researchers did is map hold 'em to a perfect-information game by tweaking it: now nobody knows their own hole cards, and all players must publicly announce their strategy to a "referee". The referee then looks at the players' cards and places bets on their behalf. A player's strategy looks something like: if I have a pair of aces, I want to raise 94% of the time, call 6% of the time, and fold 0% of the time, etc.
You know your opponent’s strategy too (as they have announced it like you did to the referee) so you can now iterate and optimise your strategies which will approach a Nash equilibrium.
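The "announced strategy" idea is easiest to see in a toy game. In this sketch of my own (rock-paper-scissors stands in for poker; the numbers are made up, not from the hold 'em work), once a mixed strategy is public, the opponent can compute a best response to it, and at the Nash equilibrium the best response gains nothing:

```python
# Rock-paper-scissors payoff for the row player: +1 win, 0 tie, -1 loss.
ACTIONS = ('rock', 'paper', 'scissors')
BEATS = {'rock': 'scissors', 'paper': 'rock', 'scissors': 'paper'}

def payoff(a, b):
    if a == b:
        return 0.0
    return 1.0 if BEATS[a] == b else -1.0

def best_response(opponent_strategy):
    """Best pure action, and its expected value, against a publicly
    announced mixed strategy (a dict mapping action -> probability)."""
    values = {a: sum(p * payoff(a, b) for b, p in opponent_strategy.items())
              for a in ACTIONS}
    best = max(values, key=values.get)
    return best, values[best]

# Against a rock-heavy announced strategy, paper is a profitable exploit...
action, value = best_response({'rock': 0.4, 'paper': 0.3, 'scissors': 0.3})
# ...while the uniform (Nash) strategy cannot be exploited at all.
_, nash_value = best_response({a: 1 / 3 for a in ACTIONS})
```

Iterating this "announce, exploit, adjust" loop is the intuition behind the self-play methods that drive announced strategies toward a Nash equilibrium.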
As an example from mathematics, Fermat's Last Theorem was first proven conditionally on a few conjectures. In your words, it was mapped to these conjectures (though not bijectively; Fermat's theorem didn't imply these conjectures, as far as I know).
Later on Andrew Wiles proved the last of these conjectures, thus establishing Fermat's Last Theorem as absolutely true and not just relative to these conjectures.
There's quite a bit of mathematics that's only 'true' relative to the assumption of the Riemann hypothesis. See eg https://mathoverflow.net/questions/17209/consequences-of-the...
Another one, from systems design, is called an archetype. Most problems can be reduced to an archetype to understand and solve them effectively. https://thesystemsthinker.com/topics/archetypes/
A borrowable digital copy is at the Internet Archive: https://archive.org/details/bypasses00zdzi
How do we map programming to understanding genetic codes?
How do we map psychological research or zoological data to operations research?
They often start from something unrelated and work towards solving something very different.