I don't believe there exists such a thing as a 'good' methodology to solve any kind of problem, let alone a 'best' methodology. Take for instance the Millennium problems in mathematics, which have a quite precise statement: I would really be interested in seeing a 'methodology' at work.
I use the axiomatic method. Start by clearly stating what is accepted as true and then prove theorems one after another based on your previous results until you have the solution to your problem.
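To make that concrete, here is a minimal sketch of the axiomatic style in Lean; the propositions P, Q, R and the axioms h1, h2, h3 are purely hypothetical placeholders, not tied to any real problem:

```lean
-- Minimal sketch of the axiomatic method: everything accepted as true is
-- declared up front as an axiom; the final theorem is proved only from those
-- axioms, one step after another. P, Q, R are hypothetical propositions.
axiom P : Prop
axiom Q : Prop
axiom R : Prop

axiom h1 : P          -- accepted as true
axiom h2 : P → Q      -- accepted as true
axiom h3 : Q → R      -- accepted as true

-- Each result is built only from what was already established.
theorem step1 : Q := h2 h1
theorem goal  : R := h3 step1
```

The point is only that each step uses nothing beyond what was explicitly accepted or already proved.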
How about the Scientific Method? https://en.wikipedia.org/wiki/Scientific_method
It's overkill in a lot of situations. If your goal is to buy a used car for cheap, make a profitable app inside Facebook, or do whatever else that other humans already know how to do, guessing can work fine, and is a lot less work.
But if you're operating at the frontier of human knowledge, like trying to discover new biotech drugs, you don't know enough to guess correctly, and will inevitably end up wrong too often to make any progress at all. In that case, rigorous adherence to the scientific method is our only option.
My question sits about halfway between the cheap used car and the frontier of human knowledge.
You both give me interesting solutions and methodologies. Indeed, the scientific and axiomatic methods have proven themselves. What I miss in both of your methods is an estimation of what we do not know.
It is always possible to build knowledge and solutions using an iterative method of experimental tests or demonstrations. What I am searching for is a way to develop solutions without iterative methods: to go straight to the aim even when you only partially know.
How do you estimate what you don't know before you start searching for it?
(I know this is a crazy question, but we are here to search at the "frontier of human knowledge".)
To start with, we can imagine any initial conditions we want...
I am not entirely sure what you are asking, but I can advise you not to make the mistake that Gerard Debreu did with regards to economics. I quote Steve Keen:
"It is almost superfluous to describe the core assumptions of Debreu’s model as unrealistic: a single point in time at which all production and exchange for all time is determined; a set of commodities – including those which will be produced in the distant future – which is known to all consumers; producers who know all the inputs that will ever be needed to produce their commodities; even a vision of “uncertainty” in which the possible states of the future are already known, so that certainty and uncertainty are formally identical. Yet even with these breathtaking dismissals of essential elements of the real world, Debreu’s model was rapidly shown to need additional restrictive assumptions."
A few years ago I was introduced to the notions of C-K Theory (https://www.researchgate.net/publication/244995881_CK_Design_Theory_An_Advanced_Formulation), put forward by Hatchuel and Weil in 2003.
The interesting thing about this approach is the interplay between the concepts (C) and the knowledge (K) needed to address the concept. I have used it to catalogue *my* existing knowledge and to identify what else I might need to know to make the concept "real". Of course, this theory is used in conjunction with the methods described above (e.g. Axiomatic, Scientific Approach and TRIZ). It seemed to be a good strategy for me to identify my limits of knowledge to solve a problem - albeit an engineering problem rather than pushing back the frontiers of knowledge!
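As a rough sketch (this is not the formal C-K theory, just one hypothetical way to catalogue a concept against the knowledge it needs, with an invented example), the bookkeeping might look like this:

```python
# A rough, hypothetical sketch of cataloguing knowledge against a concept,
# loosely inspired by the C/K interplay described above. Not the formal
# C-K theory, just a bookkeeping aid for spotting knowledge gaps.
from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    # Knowledge items needed to make the concept "real",
    # mapped to whether we already possess them.
    knowledge: dict[str, bool] = field(default_factory=dict)

    def gaps(self) -> list[str]:
        """Return the knowledge items we still need to acquire."""
        return [k for k, known in self.knowledge.items() if not known]

# Invented example: designing a lighter bicycle frame.
frame = Concept(
    name="lighter bicycle frame",
    knowledge={
        "material fatigue data": True,                # already known (K)
        "joining process for the new alloy": False,   # still unknown
        "cost model for small series": False,         # still unknown
    },
)

print(frame.gaps())
```

Listing the gaps is exactly the "what else I might need to know" step mentioned above.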
Very well, that is the sort of methodology I am searching for: a non-iterative solution-design method based on analogy and general concepts. Something like a paradigm for the use of knowledge to solve problems.
I once visited a museum in Sochaux, the Peugeot museum, where you can see the evolution car after car. When you look at one car and the next, you can see how the design evolved. Sometimes it comes from a new scientific discovery, but most of the time the new development is a sort of optimization of the previous one. It takes many iterations of optimization to arrive at a modern car.
When you change something to make it more optimal, you need to change something else to make it more optimal too. And when you have a great idea to improve the system, there already exists a better idea that you will only see later (even if you already knew everything you needed to know to have it...).
In the end, research is very slow because of this phenomenon; it would be great to be able to skip the intermediate steps and go directly to the best solution (not only quantitatively but also qualitatively).
I understand what you say about the danger of imagining what you don't know. I agree about the necessity of a serious method (axiomatic, scientific...), but these methods only allow you to validate knowledge. I am searching for WISDOM, or a knowledge of WISDOM.
I'm not sure that I fully agree with you about the "... necessity of a serious method (axiomatic, scientific...), but these methods only allow you to validate knowledge." The main reason is that these approaches are tactics for discovering greater insight into the knowledge; for example, actively seeking ways of collapsing (falsifying) the hypothesis, and every time the hypothesis survives such an attempt it becomes stronger. Validation implies "correct given the context", whereas experimentation implies discovering the context and learning about the behaviour outside this context.
Apologies for the vagueness of this, I'm trying to recall the philosophy of Karl Popper that I read many years ago. Just writing this makes me think that it is time for me to re-visit it.
I agree that experiment is the only way to be 100% sure that something is true or good.
Let's take the problem differently. We are working to improve or develop something using the axiomatic and scientific methods. We get a result and we run an experiment to observe this result outside its development context. The aim is for the experimental results to be as close as possible to the theoretical result...
This shifts the problem to "how do I find the best context?". That remark sends me back to your first answer, about "identifying my limits of knowledge to solve a problem".
I guess that from the scientific method + the axiomatic method + the question of the limits of knowledge (addressed by C and K) it is possible to get well organised.
I thank you all for your answers and the references you have given me. I guess I need to read about all of this in more detail.
There are still some questions about controlling the focus of ideas when you decide how to write your problem statement and which knowledge you will select (NLP in creativity).
* In every iteration (or cycle in the picture): you make assumptions and validate them by experimental tests. Every validation of an assumption creates a new piece of information, and sometimes, by generalisation over many experiments, you prove a new rule, which means new knowledge.
* In mathematics: you have axioms and theorems, and from a piece of information considered true you can deduce a new piece of information that is also considered true (if the first one is true) = a proof.
* Knowledge engineering: you have pieces of information, you have rules or other information acting as "predicates", and you can create new information from them (this is the symbolic, cognitive way for AI to reason); a small sketch follows this list.
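To illustrate the third bullet, here is a small forward-chaining sketch; the facts and rules are invented purely for illustration:

```python
# Forward chaining: facts plus if-then rules produce new facts until
# nothing new can be derived. Facts and rules are invented examples.
facts = {"ball is grey", "ball is round"}
rules = [
    ({"ball is round"}, "ball can roll"),
    ({"ball is grey", "ball can roll"}, "ball is a bowling ball"),
]

changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)   # a new piece of information is created
            changed = True

print(facts)
```

Each pass of the while-loop is one iteration, which is exactly the step-by-step character described next.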
I am speaking about everything in general here (I hope not too much so): we know ways to create new information, but they all use iterative methods, one step and then the next (the same process of analysis again), or cyclic methods if you prefer.
In mathematics you have a concept, "cardinality": if you cut a space into n parts and you know the first n-1 parts, you can conclude what the last part, the one you don't know yet, must be. Can you do the same with knowledge conceptualisation? For example:
The ball is grey, the ball is round, [method] --> the ball has a... temperature; what is this temperature?
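The mathematical half of that analogy is easy to sketch: if a set is cut into n parts and n-1 of them are known, the last part is forced. A tiny example with arbitrary sets:

```python
# The partition analogy: the unknown last part is the complement of the
# union of the known parts. The sets here are arbitrary examples.
universe = set(range(10))                      # the whole "space"
known_parts = [{0, 1, 2}, {3, 4}, {5, 6, 7}]   # the first n-1 parts

last_part = universe - set().union(*known_parts)
print(last_part)   # -> {8, 9}, deduced without being observed directly
```

The open question is whether anything like this "forced complement" exists for knowledge rather than for sets.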
I know it looks crazy... but anything that can bring us closer to this is welcome ^^
The best method consists of two steps. First step: provide (or find, if it is already available) a physical meaning. Second step: use your physical intuition as a basis for constructing a mathematical approach.
Most of my problem solving was done in one of three ways. The first is pure chance. Often when trying to answer one question I accidentally discover the answer to some other question. I write these down in a notebook in case I ever care about that question (occasionally I do). The second is to solve the problem backwards. I start with an answer and try to figure out what question it answers. Integration tables are constructed this way. We can differentiate just about any expression that we can write down, and whatever we get is something that we know how to integrate. The third method is pure stubbornness. I will keep at it until I get the answer, even if it takes years. My lifetime is not long enough to solve a lot of problems that way, but I did get a few done. But I have never discovered an efficient method for solving a problem that has not already been solved. I doubt that there is a method that can substitute for luck (thinking of the right thing at the right time, or finding the answer by accident) and creativity.
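The "solve it backwards" idea is easy to sketch, for instance with sympy (the expression below is arbitrary):

```python
# A tiny sketch of "solving the problem backwards" to build an integration
# table entry: differentiate an expression we chose, and record the pair.
# The expression is arbitrary; sympy is assumed to be available.
import sympy as sp

x = sp.symbols("x")
answer = sp.exp(x) * sp.sin(x)     # start from an "answer" we can write down
question = sp.diff(answer, x)      # differentiating it gives the "question"

# We now know how to integrate `question`: its antiderivative is `answer` (+ C).
print(sp.Eq(sp.Integral(question, x), answer))
```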
I don't believe there is a general methodology for all problems. Quite often, however, you may find it helpful to
a) consider special cases and to adapt any such special solution to the general problem. You may also test any ideas for the general problem and see (qualitatively and/or quantitatively) whether the idea is at least OK for the special case
b) use analogy to other problems with known solution
c) simulate the system (at least an example) for which you want to solve a problem and see whether you can use the simulation to identify or test possible solutions (a small sketch of this follows the list)
d) consider if a general solution would be computable at all
e) look at asymptotic regimes for one or all parameters of the problem (system) and try to find asymptotic series as solutions
f) try to split your problem into a solvable part and a rest that may be treated perturbatively.
g) use symmetry considerations
h) see whether the problem or an analogous one has been solved in a neighboring discipline and learn how it was solved there
i) discuss your problem with as many people as you can find
j) learn from the best researchers in the field
k) mix or switch approaches, e.g. algebraic ones with geometric ones etc.
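As an illustration of point c), here is a minimal sketch of using a simulation to screen candidate solutions; the toy system, the candidate values, and the scoring are all invented for illustration:

```python
# Simulate a toy system and use the simulation to screen candidates:
# choose a gain k for a simple first-order process so it tracks a target
# quickly without wild overshoot. Everything here is an invented example.
def simulate(k: float, steps: int = 200, dt: float = 0.05) -> float:
    """Return a crude badness score for feedback gain k on the toy system."""
    x, target = 0.0, 1.0
    badness = 0.0
    for _ in range(steps):
        x += dt * k * (target - x)        # very simple first-order response
        badness += abs(target - x) * dt   # accumulated tracking error
        if x > 1.5:                       # penalize wild overshoot
            badness += 10.0
    return badness

candidates = [0.2, 0.5, 1.0, 2.0, 5.0]
scores = {k: simulate(k) for k in candidates}
best = min(scores, key=scores.get)
print(scores, "-> best candidate:", best)
```

Replacing the toy model with an actual model of the system of interest is, of course, the hard part.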
In a sense, learning science is just learning how to solve problems. Of course this is not restricted to science.
Herbert mentioned something that I completely forgot about: simulations. I assume that we are discussing difficult problems, because easy problems can be solved by remembering how similar problems were solved in the past. Simulations were an enormous help in my work. They can show which ideas are bad and which are promising enough to be worth further investigation.
We should define the situation or problem clearly. We approach the problem from various directions using critical thinking, compare the possible solutions against our problem, and pick the better one. Prepare for the worst possible outcome and how to overcome it: before implementing our decision, ask what is the worst that can happen if this decision doesn't work. Set measures on our decision. Take complete responsibility for the decision and accept complete responsibility for implementing it. Many of the most creative ideas never materialize because no one is specifically assigned the responsibility for carrying out the decision. A decision without a deadline is a meaningless discussion. If it is a major decision that will take some time to implement, set a series of short-term deadlines and a schedule for reporting. Develop a sense of urgency: the faster we move in the direction of our clearly defined goals, the more we develop our capacity to achieve even more in the future. We can solve any problem, overcome any obstacle, or achieve any goal that we set for ourselves by using our creative mind and then taking action consistently until we reach the objective.
What you describe may apply, for instance, to technical problems arising in the context of a company. In that case, you have to come up with the "best" solution (which may not even be a "good" solution) in a limited time frame. It doesn't work like that in the context of "hard" scientific problems. Take for instance the unification of quantum mechanics and general relativity in physics, or the Millennium problems in mathematics. I doubt setting a deadline will help...
All the solutions you give me start from accepting that you do not control what you will discover in the future. You apply techniques and processes step by step, or you search more or less randomly in the collective memory.
I see that this problem is very tactical and depends on the scientific context. It would be nice to develop a theory based on everything you have said. You know, something like a "scientific organisation of work", but for research.
Here is something that I learned by getting suggestions from others on Research Gate. Suppose the problem to be solved is one in which a good answer can be distinguished from a not-as-good answer by producing a smaller value of something bad. The classical example is to find the best fit to some data as defined by the fit that minimizes some measure of error. I asked for recommendations on how to do this (for a more specific problem of finding the fitting parameters that make some function a best fit to data) on Research Gate and learned from the experts that no general procedure has yet been invented. Methods that follow the gradients of functions (for example) are effective for some kinds of problems but not for problems in which there are numerous relative minimums in the error measure. Advice from the experts motivated the method that I ended up using and that got the job done. This two-step process starts with a random search to separate one relative minimum from another, followed by the gradient method for fine-tuning. The most difficult problem was to convince my co-workers that a random search is not something to be ashamed of. If you are dealing with a more abstract and difficult problem, you should probably not expect to find a methodology that is better than the best available for curve fitting, which is not very good.
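As an illustration of that two-step process (a random search to land near a promising relative minimum, then a gradient-based method for fine-tuning), here is a sketch with numpy and scipy; the model, the data, and the parameter box are invented:

```python
# Two-step fitting sketch: random search over a broad box to find a good
# starting point, then gradient-based refinement. Model and data are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic data from a damped oscillation with known "true" parameters.
xdata = np.linspace(0.0, 10.0, 200)
ydata = 2.0 * np.exp(-0.3 * xdata) * np.cos(1.7 * xdata) \
        + 0.05 * rng.normal(size=xdata.size)

def sse(params):
    """Sum of squared errors of the model against the data."""
    a, b, w = params
    model = a * np.exp(-b * xdata) * np.cos(w * xdata)
    return float(np.sum((model - ydata) ** 2))

# Step 1: random search over a broad box to separate one minimum from another.
lo, hi = np.array([0.0, 0.0, 0.0]), np.array([5.0, 2.0, 5.0])
starts = lo + (hi - lo) * rng.random((500, 3))
best_start = min(starts, key=sse)

# Step 2: gradient-based fine-tuning from that starting point.
result = minimize(sse, best_start, method="L-BFGS-B")
print("fitted parameters:", result.x, "error:", result.fun)
```

Whether 500 random starts are enough depends entirely on how rugged the error landscape is; there is no general guarantee.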
This leads to my big complaint about modern analysis. The language has become fancy enough to give the impression that almost any problem can be solved by someone smart enough to understand the fancy talk. In reality, there is still not a good method for curve fitting. And certainly not a good method for solving problems more difficult than curve fitting.
I think you are speaking about something like a genetic algorithm. For me this is a good method; I have used it many times and succeeded in getting good results (it is the most robust method I know, if you have a powerful enough computer). But in my opinion, the main complexity is in how you model your problem. When you do curve fitting, you already know which parameters you will use and how they will act (or at least you have planned many scenarios to check).
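For completeness, here is a very small sketch of the kind of real-valued genetic algorithm meant here; the fitness function (the Rastrigin function, which has many local minima) and all the settings are just illustrative choices:

```python
# A very small real-valued genetic algorithm sketch; the fitness function
# and all settings are invented examples, not a tuned implementation.
import numpy as np

rng = np.random.default_rng(1)

def fitness(x):
    """Rastrigin function: many local minima, global minimum 0 at x = 0."""
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

dim, pop_size, generations = 5, 60, 200
population = rng.uniform(-5.12, 5.12, size=(pop_size, dim))

for _ in range(generations):
    scores = np.array([fitness(ind) for ind in population])
    order = np.argsort(scores)
    parents = population[order[: pop_size // 2]]   # selection: keep the best half
    # Crossover: average random pairs of parents.
    pairs = rng.integers(0, len(parents), size=(pop_size, 2))
    children = (parents[pairs[:, 0]] + parents[pairs[:, 1]]) / 2
    # Mutation: small Gaussian noise keeps diversity.
    children += rng.normal(scale=0.1, size=children.shape)
    population = children

best = min(population, key=fitness)
print("best solution:", best, "fitness:", fitness(best))
```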
That is my point. Even when the problem is specific enough for the curve-fitting parameters to make sense, there is still no good method for evaluating them when the problem is even a little bit complicated (complicated enough that the gradient method doesn't work). More difficult problems are more difficult to solve and usually require some luck to make someone think of the right thing at the right time.
1. Maximizing the upside through an evolved series of successes - recursive optimization. Imparting momentum to "small wins".
2. Minimizing the downside by planning for smaller failures - embedded sub-optimization. Localizing and containing "blowups".
Though bad for publicity, failing more often and faster, while learning from it, seems to be the best way to make our solution systems more fault-tolerant and more resilient. To a certain extent, the right evolved heuristics will often outperform hardcore optimization.
The next best option would be to design and embed planned waste, buffers and backup capacity into the system, as an alternative to evolved problem-solving.
Article: Building Edge-Failure Resilient Networks
Conference Paper: Automatic Discovery of Optimisation Search Heuristics for Tw...