ChatGPT is remarkable

I've been trying to get some Art "commissioned" by one of the Art ML tools but it kept drawing the masts for the sailing boats in the wrong place. I then realised it was drawing catboats and cat ketches. I've never heard of them before, but saw one the other day. Looked a bit like this

[Attached image: Freedom44.jpg]


The difference between Alexa and ChatGPT is that ChatGPT can learn context, so it's true ML. Alexa is just AI.
 
but it kept drawing the masts for the sailing boats in the wrong place.
That reminds me of this recent article from Hackaday...


For a language-model-based AI to be able to 'create' a 3D model is definitely an interesting concept!
 
The maths to describe the flight of a ball is indeed fiendishly complex
You think so?
It's basically a parabola, which mathematically is trivial. Add terms for air resistance, any spin, maybe more, like the variation in "g" through the flight - there's no mystery there, it's only Newtonian physics. That's how counter-battery radar can track an incoming shell and direct return fire at the gun that launched it.
Close to what table-tennis or badminton playing drones use. Analyze, predict, react - using learned info.
Maths is just procedures, "easy" for mathematicians. Getting your drone in the right place to bat the ball back at the right angle and speed is much harder maths than knowing where a ball's going.
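To make the "parabola plus corrections" point concrete, here's a minimal Python sketch: an Euler integration of a thrown ball, with an optional quadratic-drag term. The drag coefficient is a made-up lumped constant for illustration, not a measured value for any real ball.

```python
import math

def simulate(v0, angle_deg, drag_coeff=0.0, dt=0.001, g=9.81):
    """Integrate a ball's flight with simple Euler steps.

    drag_coeff lumps air density, cross-section and Cd into one
    constant k, so the deceleration is k * |v| * v (quadratic drag).
    With k = 0 the result is the textbook parabola.
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(angle), v0 * math.sin(angle)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        ax = -drag_coeff * speed * vx       # drag opposes motion
        ay = -g - drag_coeff * speed * vy   # gravity plus drag
        x += vx * dt
        y += vy * dt
        vx += ax * dt
        vy += ay * dt
    return x  # horizontal range at landing

# In vacuum the range matches the closed form v0^2 * sin(2*angle) / g;
# with drag switched on, the ball lands noticeably shorter.
ideal = simulate(30.0, 45.0)
with_drag = simulate(30.0, 45.0, drag_coeff=0.01)
```

The point stands: each "correction" is just one more acceleration term in the loop, which is exactly why a tracking radar or a ball-batting drone can run this faster than any human can think about it.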

It seems woo-woo magic to anyone who's never come across how it works - which would usually be through formal education, innit.
Dealing with associative, abstract, conceptual, and other relations is much harder; it takes psychology (and more) to describe, understand, and reuse the ways people do something. Once understood, that can be programmed. Same with morality, or a load of other disciplines not normally considered by engineers programming how, say, drones fly.
It's understandable to think something is beyond AI if one doesn't happen to have learned, at least in principle, how that something works.

AI can do a lot better, partly because it learns, building a library of axiomatic procedures which it can try, in large numbers, then assess the effects, and then try them in combination. Those rules may not have a parallel with the way humans work things out.
And it doesn't forget.
 
A maths bod explained the way Beckham would take a free kick, using graphs, charts and complex equations that made it look very difficult.
All David would do is place the ball down and do this...


...and make it look easy.
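For the curve on a free kick specifically, the extra ingredient those graphs and equations capture is the Magnus force from sidespin. A toy sketch of the sideways drift it produces - the `k_magnus` coefficient and all the numbers here are made up for illustration, not fitted to a real football:

```python
def free_kick_curve(v0, sidespin, k_magnus=0.2, dt=0.001, flight_time=1.0):
    """Toy model of the sideways curve on a spinning kicked ball.

    Treats the Magnus acceleration as k_magnus * sidespin * forward
    speed, acting sideways. k_magnus is a lumped, invented constant;
    a real model would derive it from spin rate, air density and
    ball geometry.
    """
    steps = int(round(flight_time / dt))
    x, z = 0.0, 0.0       # x: towards the goal, z: sideways drift
    vx, vz = v0, 0.0
    for _ in range(steps):
        az = k_magnus * sidespin * vx   # sideways Magnus acceleration
        x += vx * dt
        z += vz * dt
        vz += az * dt
    return x, z

# A 25 m/s kick with sidespin drifts a couple of metres off the
# straight line over one second of flight.
dist, drift = free_kick_curve(v0=25.0, sidespin=1.0)
```

Beckham never solved any of that, of course; his nervous system learned the mapping from foot contact to curve the same way the drones do - by enormous amounts of practice and feedback.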
 