🚗🍏 Apple's Windowless Future

Plus: 🧠 🌊 Meta's new AI can generate images from brain waves

Must-reads

Apple Eyes Generative AI Integration with iOS 18 by Late 2024

Apple is potentially diving deep into generative AI, with plans to bring it to iPhones and iPads as early as late 2024. Analyst Jeff Pu suggests a blend of cloud-based AI and on-device "edge AI," with a focus on personal data privacy. Rumors point to an integration of large language models with Siri for task automation and a tighter bond with the Shortcuts app. While OpenAI's ChatGPT kindled the generative AI flame, Apple's entry could redefine the space, though CEO Tim Cook has urged caution on potential misuse and biases. (MacRumors) 3-minute read

Apple's Vision Pro Could Revolutionize the Apple Car Experience

Apple has been granted a patent that envisions using its Vision Pro headset to enable passengers to see outside of an Apple Car without windows. This innovative approach would not only provide virtual views from external cameras but could also offer augmented reality experiences or transport passengers to entirely different environments matching the car's movements. While this concept has its entertaining perks, Apple sees potential applications in reducing motion sickness and enhancing productivity during journeys. (9to5Mac) 2-minute read

Google Cloud and MultiversX Collaborate on Metaverse-focused Web3 Projects

Google Cloud's collaboration with blockchain firm MultiversX strengthens its Web3 reach. This partnership facilitates powerful analytics within Google's ecosystem for Web3 projects. MultiversX has previously partnered with Audi’s Holoride and ICI D|Services. They've also unveiled enhanced features for their decentralized wallet, xPortal. (Cointelegraph) 4-minute read

Macy's Mstylelab: Bridging Real & Virtual Fashion

Macy's unveils Mstylelab, a digital platform that merges real-world fashion with the metaverse. Users can create personalized digital items and navigate a virtual New York, showcasing the new "On 34th" collection. Partnering with Journee, Macy's aims to redefine shopping experiences, making them immersive and interactive. As winter approaches, the platform will feature a metaverse rendition of Macy's Thanksgiving Day Parade. (VentureBeat) 2-minute read

Nvidia's Eureka: The AI Master Trainer for Robots

Nvidia has unveiled "Eureka," an AI agent powered by OpenAI's GPT-4, designed to autonomously teach robots intricate skills, marking a significant step in reinforcement learning. Beyond guiding a robot to master pen-spinning, Eureka extends its prowess to a wide range of tasks, improving efficiency over traditional trial-and-error methods. Its integration with Nvidia's Isaac Gym allows the public to experiment with its capabilities, underscoring the growing potential and impact of AI agents in various fields. (VentureBeat) 6-minute read

Farewell to Reddit's Blockchain Community Points

Reddit will phase out its blockchain-backed Community Points system. Introduced using the Ethereum blockchain, these community-specific tokens like "moons," "bricks," and "donuts" saw their values plummet dramatically, even causing some users to report massive financial losses. The official reason? Scalability issues and regulatory complexities. With its November 8th end date, Reddit aims to pivot towards broader-reaching projects, like the new Contributor Program that'll convert Reddit gold into actual currency. But fans of blockchain shouldn't despair; Reddit continues to support its NFT-based Collectible Avatars. (The Verge) 4-minute read

US FCC Greenlights 6GHz Frequency for AR/VR Metaverse Tech

The United States Federal Communications Commission (FCC) has approved the use of the 6GHz frequency band for low-power AR and VR devices, pivotal for metaverse advancements. This move promises faster speeds, more bandwidth, and reduced latency. The decision supports big tech ventures into AR/VR, with products like Meta's Quest 3 and Apple's forthcoming Vision Pro set to benefit. (Cointelegraph) 3-minute read

Spotlight 🔦

Meta AI Unveils Real-time Image Generation from Brain Waves

Meta AI introduces a breakthrough system that uses magnetoencephalography (MEG) to nearly instantaneously decode and reconstruct visual perceptions in the brain. This development could usher in the future of non-invasive brain-computer interfaces, especially benefiting those with speech impairments due to brain injuries.

Key Components:

  • Image Encoder: Builds a representation of the seen image, independent of any brain data.

  • Brain Encoder: Aligns MEG signals with image representations.

  • Image Decoder: Crafts a likely image based on brain data.

DINOv2, an advanced self-supervised AI model, proved most compatible with brain signals, suggesting that self-supervised AI representations align closely with human neural activations. While MEG-decoded images aren't flawless, they capture the essence of perceived visuals with exceptional speed.
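The three-component pipeline above can be sketched in miniature. This is a toy NumPy illustration, not Meta's actual architecture: the real system uses pretrained DINOv2 features, contrastive training, and a generative image decoder, whereas here the "image encoder" is a fixed random embedding, the "brain encoder" is a ridge regression mapping simulated MEG signals into embedding space, and "decoding" is simplified to nearest-neighbor retrieval. All dimensions and data are made up for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; real MEG uses hundreds of sensors,
# and DINOv2 features are much higher-dimensional)
n_images, meg_dim, embed_dim = 50, 64, 32

# 1) Image encoder: a frozen embedding per image (stand-in for DINOv2 features)
image_embeddings = rng.normal(size=(n_images, embed_dim))

# 2) Simulated MEG: each brain response is an unknown linear mixing
#    of the image embedding, plus sensor noise
true_mixing = rng.normal(size=(embed_dim, meg_dim))
meg_signals = image_embeddings @ true_mixing + 0.1 * rng.normal(size=(n_images, meg_dim))

# 3) Brain encoder: ridge regression mapping MEG signals -> image-embedding space
lam = 1e-2
W = np.linalg.solve(
    meg_signals.T @ meg_signals + lam * np.eye(meg_dim),
    meg_signals.T @ image_embeddings,
)

def decode(meg_trial):
    """'Image decoder' (retrieval variant): project a MEG trial into
    embedding space, then return the index of the most similar image."""
    pred = meg_trial @ W
    sims = image_embeddings @ pred / (
        np.linalg.norm(image_embeddings, axis=1) * np.linalg.norm(pred) + 1e-9
    )
    return int(np.argmax(sims))

# Sanity check: each trial should retrieve the image that evoked it
correct = sum(decode(meg_signals[i]) == i for i in range(n_images))
print(f"retrieval accuracy: {correct}/{n_images}")
```

The design mirrors the key idea in the blurb: the image representation is learned (or fixed) independently, and the brain encoder's only job is to align neural signals with that representation space, where a decoder can then take over.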

This innovative work builds on prior endeavours in human-like AI championed by Yann LeCun, Meta's Chief AI Scientist, highlighting AI's potential to closely mimic human cognition.

(Meta) 6-minute read

Recently funded startups 🦄 

Quest Portal raises $7.6M and launches subscription for tabletop role-playing games

Copresence raises $6M for 3D avatar creation platform

Upland raises $7M for virtual land Web3 metaverse

Oxolo raises €13M for Gen AI-driven video platform

MangoBoost raises $55M Series A for its DPU hardware and software solutions

Research 🔬

Vision-Language Models are Zero-Shot Reward Models for Reinforcement Learning

Quick Bytes ⚡️

What to Expect From the AI World Fair in Decentraland

Apple to host ‘secretive’ Vision Pro training early 2024

Inside the underground world of black market AI chatbots

The beginning of outdoor mixed reality?

Deepfakes made simple with AI: an in-depth look at FaceFusion

I did my expenses in VR and I liked it

What we’re reading

How to Build Your Own AI-Generated Image with ControlNet and Stable Diffusion

Scientists believe that your consciousness can interact with the whole universe

The Lessons of Lucasfilm's Habitat

How Meta and AI companies recruited striking actors to train AI

Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data

TOGETHER WITH METALYST

Comments, questions, tips?

Send a letter to the editor: email or tweet us

Hit the inbox of readers from Apple, Meta, Unity and more

Advertise with MetaLyst to get your brand or startup in front of the Who's Who of metaverse tech. Our readers build, invest in, and champion emerging technology. Your product could be their next favorite thing! Get in touch today.

Was this newsletter forwarded to you, and you’d like to see more?

Join the conversation
