I have far more ideas about this than time to execute them, but for a long time I’ve had this fantasy about a robot bandmate.
The idea is I’d go on stage singing and playing guitar with a looper and some samples, then bring a robot toy and introduce the robot “controlling” the looping and sampling as the bandmate.
It’s a gimmick that’s been done before, but with LLMs driving the verbal interaction, and now something like this to animate the robot, it becomes pretty compelling. I’d plug the LLM into the audio feed so I could banter with it and get responses, then have the robot avatar animate accordingly.
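Roughly, this is the loop I’m picturing. It’s a toy sketch with text standing in for the audio feed; listen(), ask_llm(), and animate_robot() are hypothetical stubs, and wiring in real speech-to-text, TTS, and the looper is the actual work.

    # Toy sketch of the banter loop: hear something, get an LLM reply,
    # have the robot deliver it. All three functions are stand-ins.
    def listen() -> str:
        return input("me (mic): ")  # stand-in for speech-to-text on the audio feed

    def ask_llm(heard: str) -> str:
        return f"(robot banter about: {heard})"  # stand-in for the LLM call

    def animate_robot(line: str) -> None:
        print(f"robot: {line}")  # stand-in for TTS plus servo/animation cues

    while True:
        heard = listen()
        if heard.lower() in {"quit", "exit"}:
            break
        animate_robot(ask_llm(heard))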
If only my full time job saw value in this project.
Imagine when you don't need money anymore because everything is automated to oblivion. Everything is affordable. So people like you won't have to work to make a living; you'd just do your art instead. Better for everyone!
Cool! As a fun moonshot idea, I’ve been interested in MCP as a way to use informal conversations to task robots. I’ll have to play around with this!
One example on unmanned boats: a human could radio to the boat over VHF and say “move 100 meters south”… the speech-to-text output would feed an LLM, which extracts the intent and calls the appropriate MCP tool.
Ya, sounds like a good idea to let the LLM do all the calculations and send simple instructions to the boat. MCP tells it what data is available from the device.
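For concreteness, here’s a rough sketch of what the boat side could look like with the Python MCP SDK’s FastMCP helper: a get_position tool exposes what data the device has, and a move_relative tool takes the distance and bearing the LLM extracts from “move 100 meters south”. The tool names, the placeholder coordinates, and the autopilot stub are all made up for illustration.

    # Hypothetical boat-control MCP server sketch (Python MCP SDK / FastMCP).
    import math

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("boat")

    # Pretend state; a real server would read the GPS and autopilot.
    state = {"lat": 30.0, "lon": -91.0}

    @mcp.tool()
    def get_position() -> dict:
        """Return the boat's current latitude and longitude."""
        return dict(state)

    @mcp.tool()
    def move_relative(distance_m: float, bearing_deg: float) -> str:
        """Move distance_m meters along bearing_deg (0 = north, 180 = south)."""
        # Flat-earth approximation; fine for a 100 m hop.
        dlat = distance_m * math.cos(math.radians(bearing_deg)) / 111_320
        dlon = distance_m * math.sin(math.radians(bearing_deg)) / (
            111_320 * math.cos(math.radians(state["lat"]))
        )
        target_lat, target_lon = state["lat"] + dlat, state["lon"] + dlon
        # send_to_autopilot(target_lat, target_lon)  # hypothetical hardware call
        return f"heading to {target_lat:.6f}, {target_lon:.6f}"

    if __name__ == "__main__":
        mcp.run()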
This seems like a residential address. Irrespective of whether the creator deliberately exposed it, I would be a little bit cautious about sharing it further.
And I, for one, welcome our new AI MCP overlords. I'd like to remind them that, as a trusted TV personality, I can be helpful in rounding up others to toil in their underground datacenters.
I’ll have to install this and play around.
I tried the MCP server with the demo (https://merliot.io/demo) using Cursor and asked:
What is the location of the "GPS nano" device?
The location of the "GPS nano" device is: Latitude: 30.448336, Longitude: -91.12896
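For anyone who wants to poke at the same kind of query outside Cursor, here's a rough sketch using the Python MCP SDK client. The launcher command and the get_location tool name are guesses on my part, not the demo's actual interface.

    # Hypothetical script: list the server's tools and ask for a device location.
    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Guessed launcher; substitute however the Merliot MCP server is actually started.
    server = StdioServerParameters(command="merliot-mcp", args=[])

    async def main() -> None:
        async with stdio_client(server) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()  # what the LLM would see
                print([t.name for t in tools.tools])
                result = await session.call_tool(
                    "get_location", arguments={"device": "GPS nano"}
                )
                print(result.content)

    asyncio.run(main())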
https://www.marksetbot.com/
(not affiliated, just a fan)