Greatest unsolved mysteries in physics

Interesting this troll changes his name constantly. Saint Guinefort is a punk.

Try to engage with the topic. I told you I would change my name and I did. I can do that on here.

Stop trolling and attacking. Try engaging with the topic as I have done. Or just continue to attack people. The choice is yours (presumably).
 
How can you be sure? I'm not even certain what that would look like or how we would determine it. AI is very much like regular intelligence. It learns in nearly the EXACT SAME WAY you and I learn (from taking in information and creating mental models to act on that information). And even the scientists who work with AI are uncertain how it does its "magic" or arrives at a given conclusion. Just like another human being. I have no idea how you arrive at conclusions in your mind...you are a "black box" to me, and vice versa.

As for "self-awareness", how would you even know if the AI was self-aware? I don't even know for certain that you are self-aware except that you have a physical brain like me and I know how mine works so I ASSUME you are like me. "Theory of Mind" and all that. But it's just a guess.

It can learn and simulate, but it can never develop any idea or any concept independently. It can never develop any tools independently.
 
The grammatical structures are already present. Just substitute words for words and place them in their proper places and presto, it taught itself new languages!

That's an oversimplification of the process. First of all, it was not prompted to do this. As for "the grammatical structures are in place, just substitute the words", that's not how languages usually work. Grammatical structures differ radically across languages. If I wanted to learn Chinese and German, those are two RADICALLY different structures and radically different languages at an almost atomic level.

But even so: how does a human learn languages? Probably exactly the same way an AI learns them. So I am still not seeing how it is fundamentally different. At the end of the day there's no "magic" in a physical brain. It's just a computational network of nodes. That's what's doing the work. Just like a physical computer is just a computational network as well.
 
Exactly. Not from outside of itself.

But isn't that exactly what YOU do in your brain? You read books, take classes and build a "training set", then you create connections within your neural network and expand out the application of those things which you learned. That allows you to make new connections and learn new things. This is what the AI is doing as well when it teaches itself a new language. It isn't just "substituting" words, it is "understanding" the structures (which can be DRAMATICALLY different).

It is doing exactly what you and I do to learn new things.

And it can code as well. So I'd say it has the ability to not only learn new things but enact them as needed.
 

An AI program might approach the Chinese language and look for what is common among the symbols. It could learn using simplified Chinese and, with already-learned syntax from various languages, develop a basic Chinese language structure. Then translation won't be far behind. What is common? Those commonalities are already programmed in by various programmers. Nothing original. Nothing spontaneous.
 

I fear you are grossly oversimplifying the process of learning languages. But that being said, how do YOU learn languages? How it is fundamentally different from the AI?
 

For me personally? It's a mystery. I learned sign language as I was born deaf. Then later I learned written English. I was able to translate between those two languages.
 

I think that is the key. Your understanding of languages is fundamentally different from mine since I was not born deaf. We are two different types of machines but we both started off with the same hardware. The network in your brain was established in a very different way from mine, and that's got to be an amazingly cool difference. But, at some fundamental level, you are as mysterious to me as I must be to you in regards to languages, and both of us are mystified by the AI's understanding.

In a real sense everyone is a black box to everyone else. How you process concepts like "pronunciation" (if you do at all) or subtleties of "tonal languages" (like Chinese) is a complete mystery to me. And presumably I am to you at some level.

That's my point about AI. That it walks like a duck, quacks like a duck, has feathers like a duck, but I'm constantly being told it is not a "real duck". I think I disagree.
 

It's good to disagree. See that's what makes us different from AI.

I had to take a course in AI in college for my CS major. We covered what is called "fuzzy logic". It's a value between 0 and 1. It isn't something random but a decision that must be made between 0 and 1. Imagine millions of those decisions and you get what is considered "AI". But that's all it is: programming, created by human beings.
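The 0-to-1 idea above can be sketched in a few lines. This toy membership function (the midpoint and steepness values are invented purely for illustration) shows how a fuzzy system assigns a graded degree of truth instead of a hard yes/no:

```python
import math

def warm_membership(temp_c, midpoint=22.0, steepness=0.5):
    """Degree (0..1) to which temp_c counts as 'warm' — a fuzzy truth value."""
    return 1.0 / (1.0 + math.exp(-steepness * (temp_c - midpoint)))

# Rather than a crisp boolean, each input gets a graded degree of membership.
for t in (10, 22, 35):
    print(t, round(warm_membership(t), 3))
```

A fuzzy controller then combines huge numbers of such graded values into decisions, which is the point about "millions of those decisions" made above.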
 
That's your problem. Einstein was a brilliant man but not perfect. He wasn't a "god" of any sort. It is even debated whether his first wife, who may very well have done much of the heavy lifting in the mathematics of his Annus Mirabilis papers, was the stronger mathematician.
I already know about Einstein's limitations-->

A good scientist is, above all, a good observer. A keen observer. Someone possessing clarity of mind.

We have to remember that Einstein himself was a mediocre student, who was stuck at a dead end job in the Swiss patent office. Hardly an auspicious start for any scientist, let alone a world class scientist.

What Einstein had - and Darwin had too - were very creative minds. Free thinking minds. Minds that could think outside the bounds of convention. I mean, what made Einstein famous was not his dexterity at higher math. It was his thought experiments - the insights he reached simply by thinking about the speed of light and relative motion. In hindsight, those were actually fairly simple - but very profound and creative insights.
Einstein's real talent - the real source of his genius - was visualization. He did not think in terms of math or equations - he thought in images, in mental pictures. To him, the images came first. The words and the math came later. And it was in the images where the genius and brilliance were formulated. After that, it was just busy work. His special theory of relativity was basically spawned merely by thinking about what lightning strikes look like to observers in different frames of reference. And that had profound implications for insights into the nature of space and time from the perspective of the speed of light.



Perry PhD: Second Law of Thermo as you noted. Sure there's still a question of why but it's not that hard to understand when you take a hot cup of water out of the microwave. There's literally no reason for the hot item to absorb MORE heat from the cooler surroundings.
From what I've read, that's not how cosmologists think about the low entropy initial state of the universe. It's not about heat. They think about it using the Boltzmann definition of entropy, a probabilistic theory of the statistical mechanics of microstates in a macrostate.

No, the low entropy initial state of the universe is not easy and simple to think about-->

"Cosmology would like to explain why the Big Bang had low entropy, but our best current models aren’t up to the task." --> Sean Carroll, theoretical physicist, CalTech

 
Actually it is. Why would low energy transition to high energy? There's literally no driving mechanism. Why would a disordered state spontaneously organize into an ordered state without additional energy put into the system?

Heat is just a great subject to understand it and it relates directly to Boltzmann.

There's only a driving mechanism in ONE direction. And it isn't necessarily all that mysterious per se. Which is why I discussed it in terms of heat (since that is also EXACTLY why Boltzmann was working with entropy in the first place)
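That one-directional drive can be sketched numerically. A minimal toy based on Newton's law of cooling (the cooling constant and temperatures here are made up for illustration):

```python
def cool(temp, ambient=20.0, k=0.1, steps=50):
    """Step a hot object's temperature toward ambient; heat never flows the other way."""
    for _ in range(steps):
        temp += -k * (temp - ambient)  # rate proportional to the temperature difference
    return temp

# A 90 C cup relaxes toward room temperature; it never spontaneously reheats.
print(round(cool(90.0), 2))
```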
 

You should submit your research and data to a prestigious science journal and tell them you have single-handedly solved the Past Hypothesis.

Heat by definition in physics is the flow of energy driven by a temperature difference.

What is being discussed by cosmologists, as I understand it in this context, is not energy flow. What is being discussed is the hypothesized low entropy initial state of the universe. Or, to put it in terms of the Boltzmann definition of entropy: the statistical mechanics of the microstates in the initial system.

Flow is an after-the-fact emergent property of the expanding universe.


You don't understand a single thing you just typed out. But good googlin' on your part.

This is like most of your OPs. You just parrot whatever you've read somewhere.
 
Projection on the Google.

Yes, I am going to have to trust professional astrophysicists concerning the low entropy initial state of the universe.

You apparently believe knowledge just randomly pops into your mind, without having to take classes, read books, read articles by subject matter experts.

The Boltzmann definition of entropy, as a theory of the statistical mechanics of microstates, is in principle not that hard to understand at a basic conceptual level. A basic conceptual understanding is about at the undergraduate level.

The fact you've never heard of it does not make it incomprehensible at a basic intuitive level.

The application and advanced statistical mechanics of it are way beyond anybody on this board, including me.
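For what it's worth, the basic conceptual picture described above (Boltzmann's S = k ln W, where W counts the microstates that realize a given macrostate) can be illustrated with a coin-toss toy model; k is set to 1 here purely for illustration:

```python
import math
from math import comb

N = 100  # toy system: 100 coins; the macrostate is the number of heads
for heads in (0, 25, 50):
    W = comb(N, heads)   # number of microstates realizing this macrostate
    S = math.log(W)      # Boltzmann entropy, with k = 1
    print(heads, W, round(S, 2))
```

The perfectly "ordered" macrostate (0 heads) is realized by exactly one microstate, so its entropy is zero; low-entropy states are improbable simply because vastly fewer microstates produce them.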
 

LOL.

Like you know the first thing about any of this. You are a joke.
 

You seem to be right on the verge of screaming at me in all caps, and hollering that you wish I would leave the board.

Just because you haven't heard of the Boltzmann definition of entropy doesn't mean it is incomprehensible.

An intelligent high school senior can easily intuitively grasp the basic concept of Boltzmann positional entropy.

 