This question relates to my latest preprint excerpt:

Preprint: Logical Inference using Bayesian Networks (ProbabilisticLogic.AI)

My contribution is a closed-form version of Symbolic Probabilistic Inference in Belief Networks (Shachter et al., 1990), which I am calling "Probabilistic Logic" -- please review it before commenting.

---

Now that I have provably subsumed the Propositional Logic calculus {→, ¬, ∧, ∨} into the Bayesian one in a closed-form manner (no less), I'm quite sure Gödel is a footnote in the passage of time and #ASI is real. Just convince yourself: there is only one normative system that subsumes them all, with the additional fidelity, expressiveness & parsimony required to transcend classical reasoning. #Bayesian.
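To make the propositional half of that claim concrete, here is a minimal sketch of the standard embedding (mine, for illustration only, not the preprint's construction): each connective in {→, ¬, ∧, ∨} becomes a deterministic CPT node in a tiny Bayesian network, and modus ponens falls out as a probability-1 query under brute-force enumeration. The variable names, the uniform 0.5 priors and the helper functions are assumptions of this sketch.

```python
# Minimal sketch: propositional connectives as deterministic CPT nodes in a
# tiny Bayesian network, with inference by brute-force enumeration.
# Illustrative only -- not the preprint's construction.
from itertools import product

PRIOR = 0.5  # uninformative prior over each atomic proposition

# Deterministic "CPTs" for {→, ¬, ∧, ∨}: each maps parent truth values to
# P(child = True), which is always 0 or 1. Only IMPLIES is exercised below;
# the others are listed for completeness.
IMPLIES = lambda a, b: 1.0 if ((not a) or b) else 0.0
AND     = lambda a, b: 1.0 if (a and b) else 0.0
OR      = lambda a, b: 1.0 if (a or b) else 0.0
NOT     = lambda a:    1.0 if (not a) else 0.0

def joint(a, b, imp):
    """Joint probability of one assignment in the network A -> IMP <- B."""
    p_a   = PRIOR if a else 1 - PRIOR
    p_b   = PRIOR if b else 1 - PRIOR
    p_imp = IMPLIES(a, b) if imp else 1 - IMPLIES(a, b)
    return p_a * p_b * p_imp

def query_b_given(a_val, imp_val):
    """P(B = True | A = a_val, (A→B) = imp_val) by enumerating all assignments."""
    num = sum(joint(a, b, i) for a, b, i in product([False, True], repeat=3)
              if a == a_val and i == imp_val and b)
    den = sum(joint(a, b, i) for a, b, i in product([False, True], repeat=3)
              if a == a_val and i == imp_val)
    return num / den

print(query_b_given(a_val=True, imp_val=True))   # 1.0 -- modus ponens as a certain query
print(query_b_given(a_val=True, imp_val=False))  # 0.0 -- A true but A→B false forces B false
```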

BTW I did just do the computer science thing where I combined all of symbolic, probabilistic, neurosymbolic and deep learning AI into one calculus then waved my hands in the air and declared “it’s all just the same thing”.

Welcome to CompSciFutures 🤘🤘🤘

𝗢(𝔾(V,E)) ⬄ |𝗘𝗩𝗘𝗡𝗧 𝗛𝗢𝗥𝗜𝗭𝗢𝗡|

Whilst it is closed form, the computational complexity of this is huge! The future of AI looks like Neural Networks, wielded with all the arrogant certainty of a Bayesian armed with higher-order magic logic that resolves Gödel's Incompleteness-es: a calculus whose 𝔾-map is a P-map of its I-map, and which is nothing short of closed form. That is what is happening here. #BayesianJesus

For more info see https://ProbabilisticLogic.AI/

Gödel's Incompleteness-es are history. They died weeks ago.

AI is beautiful 🍍🍍🍍

PS. I don't really know! I'm a mere Computer Scientist. I do know I'm being extremely bombastic and quite silly, but this is most certainly, definitely a real question and one worth reconsidering.

We really should consider this and related concepts, such as what Probabilistic Logic means for decidability & computability, now, before I UNLEASH THE EXISTENTIAL QUANTIFIERS!!! Once that happens, we will take AI to a new level, where we have a full Bayesian representation of Predicate Logic in a closed-form manner. It is just two more operators: {∀, ∃}.
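To give a flavour of what those two extra operators could look like over a finite domain, here is a sketch under my own simplifying assumptions (independent ground atoms, a made-up three-element domain), not the preprint's construction: ground ∀ as a conjunction over the individuals and ∃ as a noisy-OR style disjunction.

```python
# Sketch: quantifiers grounded over a finite domain, assuming independent
# ground atoms. Domain and probabilities are made up for illustration.
from math import prod

def forall(probs):
    """P(∀x P(x)) = product of the per-individual probabilities."""
    return prod(probs)

def exists(probs):
    """P(∃x P(x)) = 1 - P(no individual satisfies P), noisy-OR style."""
    return 1.0 - prod(1.0 - p for p in probs)

domain = {"a": 0.9, "b": 0.8, "c": 0.99}   # hypothetical P(P(x)) for each individual
print(forall(domain.values()))             # ≈ 0.713
print(exists(domain.values()))             # ≈ 0.9998
```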

We also need to consider: from a practical applied perspective, at what point is a discrete finite countable complexity class so large that it is no longer considered usefully closed form? Because it does seem that this subsumption of all the logic calculi is heading towards supermassive complexity classes we haven't thought about before.
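As a rough reference point for that question, here is the single-exponential end of it: a full joint table over n binary propositions has 2^n entries, so "finite, countable and closed form" stops being practically enumerable surprisingly early. The cut-off points below are illustrative, not claims from the preprint.

```python
# Rough numbers only: size of a full joint table over n binary propositions.
for n in (10, 20, 50, 100, 300):
    print(f"n = {n:>3}: joint table size = 2^{n} ≈ {2**n:.3e}")
# For scale, ~10^80 atoms in the observable universe; 2^300 ≈ 2.0e90 already exceeds that.
```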

Just the complexity of subsuming Propositional Logic is already rather large. Then imagine, as a lower bound, weirdly large powersets built from graph-of-all-subgraphs-level complexity as we pass through Predicate Logic and add further logic classes to this Probabilistic Logic calculus. #TuringCompleteEventHorizon might actually be a thing! I assure you, I can go there, at the right moment.
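And a back-of-the-envelope sketch of the "powerset of all subgraphs" end of it, again under my own simplifying assumptions: there are 2^(n(n-1)/2) labelled graphs on n vertices, so the powerset of that collection has 2^(2^(n(n-1)/2)) members, and even its digit count explodes almost immediately.

```python
# Back-of-the-envelope: labelled graphs on n vertices and the size of their
# powerset, reported as a digit count because the numbers themselves stop
# fitting anywhere useful. Illustrative only.
from math import log10

for n in (3, 4, 5, 6):
    edges = n * (n - 1) // 2
    graphs = 2 ** edges                  # labelled graphs on n vertices
    powerset_digits = graphs * log10(2)  # ≈ number of digits in 2**graphs
    print(f"n = {n}: {graphs} labelled graphs; powerset has ~10^{powerset_digits:.0f} members")
```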

And of course, in true AP on #CompSciFutures fashion, there is a song or two to go with it! Just to get you in the right mindset, see [1] for a view on infinity and #causality, and [2] to understand "This is MY Place, MY House, MY Rules". All of your AI is now mine, and the era of "ALL Australian Computer Scientists are now right" has begun!

I am allowed to be quirky, I am from RMIT.

#BasementAGI #causality #LogicRules

.\𝒫

CITATIONS:

[1] Wassu, DJ, and Nicolas Giordano. “Endless Love (Nicolas Giordano Remix).” Songspire Records, September 13, 2024. https://www.youtube.com/watch?v=f9t3EHxDcvo.

[2] Grammar, London, and DJ Solomun. “House (Solomun Remix).” Ministry of Sound, May 3, 2024. https://www.youtube.com/watch?v=FO9eYkhafA4.
