Mashable

Mashable is a leading source for news, information, and resources for the Connected Generation. Mashable reports on the importance of digital innovation and how it empowers and inspires people around the world. With 25 million monthly unique visitors and 10 million social media followers, Mashable has one of the most engaged online news communities. Founded in 2005, Mashable is headquartered in New York City with an office in San Francisco.

Nvidia’s new Vera Rubin chips: 4 things to know

Wed, 01/07/2026 - 00:14

Nvidia CEO Jensen Huang announced at CES 2026 in Las Vegas this week that the company's new superchip platform, dubbed Vera Rubin, is on schedule and set for release later this year.

The news was one of the key takeaways from the highly anticipated keynote from Huang. Nvidia is the dominant player powering the AI industry, so a new line of chips is obviously a big deal. Here are four things to know as we await Vera Rubin's drop later this year.

1. There are 6 new chips across the Rubin platform

Nvidia introduced six chips on the Rubin platform, including the Vera Rubin superchip, which combines one Vera CPU and two Rubin GPUs in a single processor.

"Rubin arrives at exactly the right moment, as AI computing demand for both training and inference is going through the roof," Huang said in a statement. "With our annual cadence of delivering a new generation of AI supercomputers — and extreme codesign across six new chips — Rubin takes a giant leap toward the next frontier of AI."

2. The new line of chips is aimed at big companies

Major AI companies will look to package different parts of this new chip lineup together to build massive supercomputers that power their products.

SEE ALSO: CES 2026 live updates: See the latest news, surprises, and strange tech from LG, Samsung, Lego, and new startups

"These huge systems are what hyperscalers like Microsoft, Google, Amazon, and social media giant Meta are spending billions of dollars to get their hands on," wrote Yahoo.

3. We're not exactly sure where production is on the Vera Rubin

Nvidia assured the public the chips were set to be released this year, but when, exactly, remains unclear.

"Typically, production for chips this advanced—which Nvidia is building with its longtime partner TSMC—starts at low volume while the chips go through testing and validation and ramps up at a later stage," wrote Wired.

There had been rumors of delays, so the announcement at CES seems aimed at quelling those fears.

4. The chips should make AI more efficient

Nvidia has promised the Vera Rubin superchips are both more powerful and more efficient, which should, in turn, lower the cost of running the AI products that rely on them. That's why major companies will likely be lining up to purchase the new line of products. Huang said the Rubin chips could generate tokens — the units used to measure AI output — ten times more efficiently.

We're still waiting to get all the details — and to see when the chips actually hit the market — but the announcement certainly was a major bit of AI news out of CES.

Head to the Mashable CES 2026 hub for the latest news and live updates from the biggest show in tech, where Mashable journalists are reporting live.

Categories: IT General, Technology

Razer shows off Project Motoko, an AI-powered gaming headset with wild features

Tue, 01/06/2026 - 23:25

CES is all about wacky concepts, and Razer brought one of its own to the 2026 showcase.

The gaming accessory giant unveiled a concept gaming headset called Project Motoko that does more than just deliver audio and accommodate voice chat with friends. Much more, in fact. Motoko is actually a wireless AI wearable that's compatible with all the big AI systems, like Google's Gemini and OpenAI's models, and does a lot of the same things that a pair of smart glasses could do. Just, you know, in the form of a headset instead of glasses.

SEE ALSO: 8 gadgets from CES 2026 that you can buy right now: Dell, Xreal, Soundcore, more

By that, I mean it can use first-person cameras positioned on the front of the device to recognize objects and text in real time, with language translation and document scanning provided as examples by Razer.

It's also got multiple on-board microphones for recognizing voice commands for whichever AI assistant you feel like using. It really does seem, feature-wise, like it's got parity with any number of AI-powered smart glasses that are on the market today, but just in a different form factor.

Of course, the difference is that those glasses are actually on the market and Project Motoko is not. This is just a concept, with no release date or price at the moment.


Categories: IT General, Technology

I tried Neurable’s brain-sensing headphones at CES

Tue, 01/06/2026 - 21:55

Neurable’s pitch at CES 2026 is bold: what if performance tracking didn’t just include your mouse, keyboard, or heart rate, but also included your thoughts?

That idea is now packed into a pair of chunky-but-surprisingly-comfortable gaming headphones, built in partnership with HP's HyperX brand. Inside the headset are EEG sensors designed to read brain signals in real time, allowing Neurable’s software to track focus, cognitive load, and reaction speed while you play.

Credit: Chance Townsend / Mashable

I tried Neurable’s neurotech headphones during a private demo with the team inside the Palazzo, away from the chaos of the show floor. The headset features thick earcups and fabric padding that are designed to conceal EEG sensors without resembling lab equipment.

SEE ALSO: CES 2026 live updates: See the latest news, surprises, and strange tech from LG, Samsung, Lego, and new startups

It should be noted that Neurable didn’t start in gaming. Much of the company’s underlying tech was developed in academic settings and tested with the Department of Defense, including applications for monitoring brain health after blast exposure.

The headset supports live metrics for streamers and coaches, including focus, cognitive speed, and “brain battery,” a measure meant to indicate when you’re mentally fatigued and should probably take a break.

Before any "performance boost" happens, the system establishes a baseline. Sitting at a demo station, I watched a live graph respond to nothing more than my thoughts: focusing pushed the line upward, distraction pulled it back down. No calibration session, no gel caps, no wires running across my scalp — something CEO Ramses Alcaide emphasized as a major hurdle Neurable claims to have solved using AI-driven signal processing.

SEE ALSO: CES 2026: AMD says 'You ain't seen nothing yet' on AI

From there, the demo moved into Aimlabs, a familiar FPS training tool used by esports players to measure accuracy and reaction time. The goal is to hit as many targets as possible in a fixed time window. My first run went well, though not spectacularly, and was also hindered by the fact that my contact lenses kept sliding every time I focused too hard.

That baseline run mattered because it set the stage for PRIME.

PRIME is Neurable’s neurofeedback system, and it’s best described as a personalized meditation warm-up for your brain. Instead of asking you to "clear your mind" in the abstract, PRIME visualizes your focus and cognitive load in real time. As you relax and concentrate, dots on the screen slowly collapse into a single point — feedback that your brain is entering an optimal state.

Alicia Howell-Munson, the research scientist who developed PRIME, described it less as a relaxation exercise and more as cognitive tuning. The session lasted just over a minute for me. Others, I was told, can take anywhere from 30 seconds to several minutes, depending on fatigue, stress, or how far off their baseline they are that day.

When it ended, I felt oddly alert. Not wired, but ready. The best comparison I can make is the feeling right after a good meditation session, except with a clearer sense of purpose. Unfortunately, my contacts were still drifting.

Credit: Chance Townsend / Mashable

After PRIME, I retook the same Aimlabs test. Despite my eyes fighting me, the numbers improved. I hit more targets, and my reaction time dropped noticeably, from roughly 500 milliseconds down into the mid-450s.

That aligns with what Neurable claims to have observed in early testing. According to the company, everyday gamers and esports athletes using PRIME showed average reaction time improvements of around 40 milliseconds, along with gains in accuracy and target hits. In competitive contexts, those margins matter.

The feeling afterward was quite pleasant, all things considered. Everything on screen felt slightly slower, but I was reacting more quickly. Alcaide described it as "bullet time for your brain," which sounds corny until you experience it.

While the headset is still a proof of concept, the team expects to bring it to market in the very near future.


Categories: IT General, Technology
