Figure Status Update - AI Trained Coffee Demo

Published 2024-01-07
Figure 01 has learned to make coffee ☕️

All Comments (21)
  • @loganl7257
    What's really impressive is the guy drinking the coffee straight from the machine without burning himself.
  • @Manatek
    It's about time someone made a fully automatic coffee machine
  • For those who may not have a strong understanding of AI but are interested: If this is indeed end-to-end neural networks, that would mean the entire process was created using models that understood motor movement, balance, and dexterity. Another model for the vision - the man set a coffee machine on the table and the robot identified it. Then another model for the audio - he asked for a cup of coffee and it translated that into an objective and movement. This is just a guess, I do not know their architecture (a rough sketch of what such a pipeline could look like follows after the comments). However, if all of that was trained in 10 hours then it is incredibly impressive.
  • @USER-ruzer2000
    What smooth movements. The eyes refuse to believe that this is a real, physical robot and not computer graphics.
  • @Dryer_Safe
    The human: "Make me a coffee." Figure, turning to the Keurig: "Make him a coffee."
  • @flavb83music
    Imagine having this robot in your kitchen, at night, in this position while waiting for the coffee to be done. Creeeep
  • @christie5425
    Welcome to the future! Just found this and I'm in love with this technology. Want to have my own Figure 🤩😍
  • @fire17102
    Wow, awesome job! Coming in hot after Mobile Aloha, nice work with the corrections :)
  • @napalmqero2689
    Oh maaan I want this one!!! And the design... Really want this coffee machine now.
  • @jhunt5578
    Nice demo! Thanks for showing the progress.
  • @JigilJigil
    Keep up the good work and keep us updated with more videos.
  • @mkjyt1
    The future is going to be awesome!!!
  • @JMeyer-qj1pv
    That's impressive! It understood a voice command, recognized the objects, and was able to manipulate them to complete the task. I noticed you placed the cup in the coffee maker for it, so I guess it isn't quite dexterous enough to do that yet. I think having pressure sensors on the fingertips might help it do things like that; more multimodal inputs seem to help the AI. Keep up the good work. Once we get faster processors I bet the movements will be faster and more fluid. It would be nice to have 2 DOF in the neck so the robot could move its head to look at what it is doing; then people would intuitively know what the bot is focused on. I think it's a little off-putting when the bot just stares straight ahead all the time.
  • @tom_skip3523
    Keep it up please! The progress amazes me. Huge potential
  • @TastyAsparagus
    My robot dispenses coffee into my mouth with a romantic kiss.
  • @sausage4mash
    Pretty cool that the robot learned to make coffee just by observation! I wonder, did it require thousands of examples for training, or was it one-shot learning? The devil is indeed in the details. Also, its dexterity was quite impressive. It seemed to react to the situation in real time, which adds another layer of sophistication.
  • @dpwhittaker1
    So the only objects the robot needed to recognize were the K-Cup sitting isolated on the table, the handle, and the start button. The human had to place and retrieve the cup. What I want from my personal coffee-making robot: it gets the mug out of the cabinet full of breakable mugs, puts it in the coffee maker, selects the particular roast I want from the cabinet, which might require shifting several other boxes around and/or opening a new box from the pantry, pulls out a K-Cup, replaces the box, loads and runs the coffee maker, pulls the K-Cup out and throws it away, pulls a Splenda packet out of the bowl, opens it and adds it to the coffee, locates the creamer in the refrigerator and adds the precise amount to the coffee, stirs the coffee, brings me the mug in bed without spilling a drop, locates the empty mug later wherever I happen to leave it, brings it back to the kitchen, washes and dries it, and places it back in the cabinet for tomorrow. The robot can't make coffee yet. It can pick-and-place one part into another purpose-built robot that can make coffee and turn that robot on.
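
To make the architecture guess in the end-to-end comment above a bit more concrete, here is a minimal, hypothetical sketch of how an audio model, a vision model, and a learned visuomotor policy could be wired together at run time. Nothing here reflects Figure's actual (unpublished) system; the class names, the joint count, and every interface are invented purely for illustration.

# Hypothetical sketch of the pipeline guessed at in the comment above.
# All names and interfaces are invented; Figure's real system is not public.

from dataclasses import dataclass

@dataclass
class Intent:
    task: str            # e.g. "make_coffee"
    target_object: str   # e.g. "coffee_machine"

class SpeechToIntent:
    """Audio model: maps a spoken request to a task objective."""
    def parse(self, audio_waveform) -> Intent:
        # A real system would run speech recognition + language understanding here.
        return Intent(task="make_coffee", target_object="coffee_machine")

class ObjectDetector:
    """Vision model: finds task-relevant objects in the camera image."""
    def locate(self, rgb_image, object_name: str):
        # A real system would return a pose estimate; this stub fakes one.
        return {"x": 0.42, "y": 0.10, "z": 0.31}

class VisuomotorPolicy:
    """Motor model: maps observations to joint commands, trained from demonstrations."""
    def act(self, rgb_image, proprioception, goal):
        # A trained network would output the next motor command;
        # this stub returns a zero action (23 joints assumed, purely illustrative).
        return [0.0] * 23

def control_loop(robot, audio,
                 speech=SpeechToIntent(),
                 vision=ObjectDetector(),
                 policy=VisuomotorPolicy()):
    """One plausible way the three models could be composed at run time."""
    intent = speech.parse(audio)                 # audio -> objective
    while not robot.task_done(intent.task):
        image = robot.camera()                   # observe
        goal = vision.locate(image, intent.target_object)   # vision -> object pose
        action = policy.act(image, robot.joint_state(), goal)  # policy -> motor command
        robot.send_command(action)

The `robot` and `audio` objects stand in for hardware interfaces that are not specified anywhere in the video, so the loop is only meant to show how a voice command could flow into an objective, a detected object, and a stream of motor actions.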