Weeknotes 271 - Large Action Model to connect artificial living with real outputs

This week, CES is a trigger for thoughts on AI and the physical world. And more on mundane robotics, embedded intelligence, and wandering humans.

Interpretation of a device for connecting artificial living with real outputs by Midjourney

Hi, y’all! Good to have you here again. And a special welcome to the new subscribers who found the newsletter. Thanks, Erwin, for the shout-out. I have been following Erwin for a long time; he is one of the OG innovators who is not talking about it but doing it. He has a special gift for finding the latest and most interesting tools and just starting to build with them. Check out his newsletter (in Dutch).

This week, CES is dominating the news. The good old Consumer Electronics Show is balancing traditional consumer tech (screens, appliances) with a sense of the new hot thing; this year, there is AI in everything. Sometimes (or rather, often) it is silly, with weird ChatGPT integrations that do not make sense. AI vapourware. Still, it is an interesting indication of intelligence merging into the physical world. The Verge has nice coverage in their podcast and videos, and the MKBHD Studio walkthrough gives a step-through-the-show camera perspective. Oh, and robots seem to be becoming a thing in the home.

Triggered thoughts

I cannot avoid the Rabbit R1, a thing that tries to create a new interface to LLMs in a communication device. Or rather an action device, as they like to frame it. The LAM is a Large Action Model that supports you with all kinds of actions you would like to take. In an Apple-like keynote (but shorter), it showed all the usual suspect tasks to prove its usefulness (creating recipes from your fridge contents, planning a holistic holiday itinerary). That is not, per se, convincing. There is also a lot to oppose in the claims. Benedict Evans wrote a good column in his (paid) newsletter. He doubts that service providers like Uber will allow their business models to be broken by being bypassed through these kinds of devices. Another argument points to some basic fallacies:

There's also a risk of repeating some pretty basic fallacies. Some people in tech are bored with apps and bored with smartphones, and want to try new things, so they make a new device, even though the entire Rabbit demo was actually an app connected to a cloud service. You can say ‘we’re bored of apps!’ and Humane can say ‘we hate looking at screens!’ but I’m not sure any normal person feels like that.

It is another example of myopic product development, imho. It is another iteration of an Alexa, in a way. That said, I like some of the design choices and explorations in the physical device merged with the service. The Teenage Engineering touch in the industrial design is strong. It tries to make a new form that combines different interfaces of human and machine intelligence in a small pocket-sized device. In contrast to the Humane device, which focuses on voice and projections, this is much more tangible, both in the interactions with the push-to-talk button and visible rotating eye, and in having the cute bunny (the Rabbit) as interpreter. The comic-like representation is part of a typical Asian design style. You also see it in many car interfaces that originate in China (Nio, Smart #1). Japan was already famous for it but did not export it much beyond its own borders.

The other interesting take is the mentioned LAM, a large action model that tries to create a new framing for using LLMs in a physical context. That will be super important. One of the biggest questions is whether we will indeed start using different devices or whether this will become part of next-gen phones. If they can live up to the promised price (199 dollars with no extra subscription), they may become part of the mix of devices we carry in the future. As we now have iPhones and AirPods, we might have a different set, and the battle for a share of the device ecosystem is near. The podcast Dithering has a similar take on this ecosystem of devices. And who knows, should we read the reorganisation of Google's hardware division as signalling in this direction too?
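To make the "action model" framing a bit more concrete, here is a minimal sketch of what such a loop could look like: a language model maps a spoken request to a structured action, and a thin layer executes that action against a service. Everything here (llm_to_action, book_ride, the stubbed services) is my own illustrative assumption, not Rabbit's actual architecture or API.

```python
# A minimal, illustrative "action model" loop (an assumption, not Rabbit's design):
# a language model maps a spoken request to a structured action, and a thin
# layer executes that action against a service connector.

# Hypothetical registry of service actions the device could take on your behalf.
# In a real product these would call provider APIs; here they are stubs.
ACTIONS = {
    "book_ride": lambda args: f"Ride booked to {args['destination']}",
    "play_music": lambda args: f"Playing playlist '{args['playlist']}'",
    "order_groceries": lambda args: f"Ordered: {', '.join(args['items'])}",
}


def llm_to_action(utterance: str) -> dict:
    """Stand-in for the large action model: map a request to a structured
    action. A real system would prompt an LLM to emit this kind of JSON."""
    text = utterance.lower()
    if "ride" in text or "taxi" in text:
        return {"action": "book_ride", "args": {"destination": "the airport"}}
    if "music" in text:
        return {"action": "play_music", "args": {"playlist": "Focus"}}
    return {"action": "order_groceries", "args": {"items": ["milk", "eggs"]}}


def handle_request(utterance: str) -> str:
    """The push-to-talk loop: interpret the request, then execute the action."""
    intent = llm_to_action(utterance)
    action = ACTIONS.get(intent["action"])
    if action is None:
        return "Sorry, I cannot do that yet."
    return action(intent["args"])


if __name__ == "__main__":
    print(handle_request("Get me a ride to the airport"))
    # -> "Ride booked to the airport"
```

The interesting design question is exactly the one Evans raises: whether the services behind those stubs will keep allowing a third-party agent to act on them.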

The device sparked a lot of excitement, that is for sure; the first batch is sold out (pre-orders for Easter delivery; Easter and a rabbit, coincidence?).

Events to track

Notions from the news

It is remarkable to notice how a consumer tech fair like CES changes the focal point of the news. Robots and AI are everywhere, in a mundane way, as part of everyday objects and things. At the same time, silly things (aka gadgets) are back. Sometimes, it is hard to decide which category something fits in.

Some of the interesting robots can be found in the multiple overviews. Robots are buddies or companions, like Samsung's Ballie. Lenovo has a companion for lonely people. But there are, of course, also more function-focused robotic devices.

All the Robots We Met At CES 2024, in One Place
Improvements in AI are making robots more fun than ever.
I saw Samsung’s Ballie robot assistant at CES, and it actually seems helpful
This bowling ball-sized robot will be able to double as a projector, smart home hub, and more.
AI at CES, The Rabbit R1
CES is all about AI, which now describes everything. Then, the Rabbit R1 points to a future of hardware designed to lower the invocation cost of AI.

Someone framed it as follows: at this year’s CES, “everything connected” is replaced with “everything AI”. You can read this as another short-term fad, but it might change the way we use things for real, like shopping, or new products that are even more personalised.

The ultimate overview, with a positive impression of the level of interesting things on show.

213. CES 2024: Builders, Makers, Risk-Takers
With a maturing supply chain for connected, low-power, inexpensive devices #CES2024 was the best in a long time. From screens to homes to health, innovations abound from around the world.

Thousands of AI experts are torn about what they’ve created, new study finds
The very confusing landscape of advanced AI risk, briefly explained.

And Apple showed how to ‘steal the show’

Apple won the CES headset game without showing up
The impending Vision Pro loomed over the event.

Roles for AI

Multimodality in AI is a thing in 2024. And teaming up with AI as a team member is the other side of the coin.

Multiple AI models help robots execute complex plans
Multimodal AI system uses models trained on language, vision, and action data to help robots develop and execute complex tasks.
OpenAI debuts ChatGPT subscription aimed at small teams | TechCrunch
OpenAI has launched ChatGPT Team, a new subscription offering for ChatGPT that’s aimed at small teams (up to 149 people in size).
Prepare for AI-powered ‘agent ecosystems’ that will dominate tomorrow’s services
Agent ecosystems will develop as AI evolves from performing singular tasks to supporting agents that can work with one another.

Is an LLM any good as a lie detector, or rather a ‘truth’ detector? It will need serious scrutiny, though.

Is AI the New ‘Truth’ Detector?
AI is revealing the hidden depths of conversations—for better or worse.
Medical AI falters when assessing patients it hasn’t seen
Physicians rely on algorithms for personalized medicine — but an analysis of schizophrenia trials shows the tools fail to adapt to new data sets.

And will it trigger a new computing platform architecture?

A New Compute Platform for Generative AI?
Is generative AI big enough to spark the creation of a new compute platform?

And the impact on productivity: what it will bring is not certain yet.

The generative AI productivity boom is coming, just don’t try to guess when
The task is daunting considering the complexity and sophistication of the models deployed, the wide range of applications they serve and the inherent uncertainty about how they will evolve.

The OpenAI store for custom AI bots is here. At the same time, Sam Altman is ‘acting’ conscious of the impact, a proven strategy to stress potential impact. AI experts are sincerely confused.

OpenAI launches a store for custom AI-powered chatbots | TechCrunch
OpenAI has launched the GPT Store, a marketplace where users can find and use AI-powered chatbots customized for specific use cases.
Sam Altman says it’s ‘potentially a little scary’ how quickly society will have to adapt to the AI revolution
Sam Altman said on Bill Gates’ podcast that it’s “potentially a little scary” how quickly we’ll need to adapt to changes brought by AI.

“AI is a lever that becomes a lens”. It's always nice to have Dan Shipper’s reflections.

ChatGPT and the Future of the Human Mind
AI is a lever that becomes a lens

What happened to the metaverse? AI took over rapidly.

How AI Replaced the Metaverse as Zuckerberg’s Top Priority
Meta’s founder has become deeply engaged in his company’s AI efforts ahead of its 20th anniversary, but his close attention hasn’t always proved to be a recipe for success.

Mundane robots

Yes, some humanoids are becoming more serious products. And in kitchens too. More and more, these robots are robotic things rather than traditional robots.

1X Secures $100M in Series B Funding
1X is pleased to announce it has raised $100 million in Series B funding with participation from EQT Ventures and other notable global investors.
Steak Toasters, Indoor Smokers and Robot Cocktails: All the Kitchen Tech of CES 2024
CES is more than phones, laptops and TVs -- check out the most interesting cooking tools and gadgets we saw this year.

Autonomous driving

Is autonomous driving still happening? It might be, but more slowly. Or is the AI in cars not, per se, focused on self-driving after all? It was a theme at CES.

Autonomous driving is ‘happening’, but slower than expected
To the believers, the oft-promised autonomous car revolution is “clearly happening”—they point to the myriad displays at the Consumer Electronics Show in Las Vegas that defy the industry’s bad headlines.
Forget self-driving cars — CES 2024 showed us how AI will truly benefit drivers
AI is coming to cars, but not in the way you might expect
CES 2024 was a showcase for how to shoehorn AI into next-generation cars
CES 2024 in Las Vegas underlined that future mobility will be shaped by AI, like it or not, as intelligent assistants emerge to guide, plan and converse with their human cargo

What is more ‘disturbing’ about this CyberRunner experiment: that it is so quick, or that it cheats to achieve a better outcome?

AI Robot Bests Marble Maze Game
It’s a trip watching how fast CyberRunner can run a marble through this wooden labyrinth maze.

Digital life tooling

Good to see Tom Coates back to blogging. I need to dive into the details of the long post, but I am curious to find out more about Meta's intentions in creating a Fediverse integration for Threads.

How Threads will integrate with the Fediverse – plasticbag.org

Too bad: Artifact is shutting down. It became part of my mix of sources and tools to discover news articles for this newsletter.

Shutting down Artifact
We’ve made the decision to wind down operations of the Artifact app.

The impact of 15-minute cities can be different from what you might expect, at least in England.

Ministers prioritised driving in England partly due to conspiracy theories
Exclusive: Documents show shift in transport policy influenced by unfounded fears about loss of freedom of movement in ‘15-minute cities’

Paper for the week

This seems even more relevant now, although it was written before the LLM wave.

People’s Councils for Ethical Machine Learning

The family of machine learning methods is not somehow inherently bad or dangerous, nor does implementing them signal any intent to cause harm. Nevertheless, the machine learning assemblage produces a targeting gaze whose algorithms obfuscate the legality of its judgments, and whose iterations threaten to create both specific injustices and broader states of exception. Given the urgent need to provide some kind of balance before machine learning becomes embedded everywhere, this article proposes people’s councils as a way to contest machinic judgments and reassert openness and discourse.

McQuillan, D. (2018). People’s councils for ethical machine learning. Social Media + Society, 4(2), 2056305118768303.

See you next week!

We had a very successful end to the Wijkbot/Hoodbot project as part of the Afrikaanderwijk Coöperatie celebrations. The Wijkbot played an important role in opening the new Grondstoffenstation. The CityLab010 project has ended, but the intention is to continue developing the urban robot platform within the neighbourhood and beyond. We will be shaping plans this month. The same is true for the other Cities of Things project, a Collect|Connect Community Hub; I will give that more focus in the coming period. A new graduate student from IDE will be creating a deeper concept for the intelligent layer in the coming semester. I will keep you posted! And I will share more on some other projects I am involved in within creative industries programs.

Have a great week!

Buy Me a Coffee at ko-fi.com