Blogroll
Will there be an Alien: Earth Season 2? Here's what we know.
FX's Alien: Earth quickly became one of the most obsession-worthy shows of the summer.
Over the course of eight episodes, Noah Hawley's Alien prequel series gifted viewers with bold new tech, a host of terrifying creatures, and a downright nausea-inducing mini Alien movie. With the show's Season 1 finale looming, can fans expect more Alien: Earth in a Season 2?
SEE ALSO: 'Alien: Earth's game-changing ending, explained
Will there be an Alien: Earth Season 2?
Yes! On Nov. 11, FX announced that Alien: Earth has been renewed for a second season. FX also revealed that Hawley had signed an overall deal with FX and Disney Entertainment Television, which bodes well for the future of Alien: Earth.
SEE ALSO: Five burning questions we have for 'Alien: Earth' Season 2
That's a future that Hawley has been thinking about for a long time. In a 2025 interview with Variety, Hawley mentioned his hopes for further seasons. "Season 1 is the proof of concept," Hawley said. "And if it works commercially, then Season 2 is about building a model upon which we can envision making a Season 3, 4, 5."
Don’t miss out on our latest stories: Add Mashable as a trusted news source in Google.
Hawley is also already thinking about the wait time between seasons, telling Evolution of Horror that he has "a destination in mind, story-wise" and that his main question going forward is, "How streamlined can we make the process so that you're not waiting for three or four or five years for more?"
When would Alien: Earth Season 2 air?
FX has yet to set a release date for Alien: Earth Season 2. However, the renewal announcement stated that Season 2 will begin filming in London in 2026. From that, we can make an educated guess based on Season 1's turnaround time.
Season 1 of Alien: Earth entered production in July 2023, though it began filming without its American SAG-AFTRA members due to the Hollywood strikes. (British cast members, who were not part of SAG-AFTRA, were able to keep filming.) Production later paused because of the strike, then resumed in April 2024 and wrapped in July 2024. The show premiered in August 2025, a two-year turnaround from the start of production to release. By that math, we might be waiting until 2028 for Season 2.
However, production on Alien: Earth Season 2 won't have to contend with another SAG-AFTRA strike, meaning it will hopefully take less time for Wendy (Sydney Chandler) and her Xenomorph to grace our screens again.
Alien: Earth Season 1 is now streaming on Hulu.
UPDATE: Nov. 12, 2025, 9:30 a.m. EST This article was originally published on Sept. 23, 2025. It has been updated to reflect the news of Alien: Earth's renewal.
Streaming ruined movies in more ways than you realize
Look anywhere online, and you'll see discussions about how streaming services are ruining the cinema experience and negatively impacting movie theaters. And while I agree with many of those points, there's more to it than that. This new generation of endless content available to stream at a moment's notice ruined movies and TV in more ways than you realize.
I self-host my own private ChatGPT with this tool
ChatGPT has become the poster child for artificial intelligence and large language models everywhere, but if you want something more specialized, or you want something you can guarantee is private, it isn't your only option.
Meta AI chief Yann LeCun reportedly leaving company
It seems like there's always a shakeup going down in AI, and Meta has seen more than most. The company recently laid off 600 people within its AI unit.
Now, Yann LeCun, Meta's chief AI scientist and a professor at New York University, is planning to leave the company and build his own startup, anonymous sources told the Financial Times. Meta did not immediately respond to a request for comment from Mashable.
The report says that LeCun, who won the prestigious A.M. Turing Award for breakthroughs in AI, will be leaving in the coming months to pursue a startup focused on world models. He is already working on raising capital for the venture, according to the Financial Times.
LeCun wouldn't be the first to focus on world models. According to TechCrunch, a world model is an AI system that develops "an internal understanding of its environment so it can simulate cause-and-effect scenarios to predict outcomes." World Labs, Google DeepMind, and Nvidia are all developing them as well.
This comes at a time when Meta is heavily inserting AI into users' feeds and is working frantically to keep up with AI rivals such as OpenAI, Google, and Anthropic.
The northern lights are potentially visible tonight. Where and how to see them.
The northern lights wowed sky gazers across much of the U.S. last night, making an exciting appearance in states as far south as Florida. But if you missed out, don't worry: the skies are set to be glowing again tonight as another blast of geomagnetic energy arrives, according to a prediction from the Space Weather Prediction Center.
Wrap up warm, pick a dark, secluded spot, and keep on reading to see when and how you can spot the northern lights tonight.
When are the northern lights?
Tonight, Nov. 12, the northern lights are returning to North American skies. The Space Weather Prediction Center is predicting a "severe" geomagnetic storm, meaning the chances of seeing the northern lights are pretty high.
Geomagnetic activity is measured by the planetary K index, or Kp. It ranges from 0 to 9, with higher numbers indicating stronger activity and a greater chance of seeing the aurora across the United States. Tonight's activity is forecast to reach a Kp of 8, meaning the northern lights could move far from the poles and appear bright and active, even across northern parts of the U.S. Per that forecast, the northern lights should peak around 10 p.m. ET.
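For context, the Space Weather Prediction Center rates geomagnetic storms on NOAA's G-scale, which maps Kp values of 5 through 9 to severity levels G1 through G5; tonight's Kp 8 lands at G4, "severe." A quick sketch of that published mapping:

```python
# NOAA G-scale: geomagnetic storm severity for a given planetary K index (Kp).
def storm_level(kp: int) -> str:
    scale = {
        5: "G1 (minor)",
        6: "G2 (moderate)",
        7: "G3 (strong)",
        8: "G4 (severe)",
        9: "G5 (extreme)",
    }
    return scale.get(kp, "below storm level")

print(storm_level(8))  # tonight's forecast: G4 (severe)
```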
Where will the northern lights be visible?
According to maps from the Space Weather Prediction Center, the northern states are most likely to see the northern lights tonight. However, social media posts from last night revealed that people as far south as Florida managed to catch a glimpse of them.
These maps display the view line cutting roughly through Washington, Minnesota, Wisconsin, New York, and up to Maine. Anywhere above this line has a very high chance of spotting the northern lights, though the line is approximate, not a strict boundary. And don't count yourself out if you live a little further south: with a Kp of 8, you might just get lucky...
What causes the northern lights?
The aurora borealis, commonly known as the northern lights, is a natural light phenomenon visible in the night sky, most often near the polar regions. NASA explains that the display happens when charged solar particles interact with Earth’s atmosphere, producing glowing streaks of green, pink, and purple that ripple and move overhead.
Viewing tips for the northern lights
Catching a glimpse of the northern lights can be a stroke of luck. There are many elements at play, including your local forecast and cloud coverage, but a few things can help. As ever with these events, travel away from light pollution as much as you can, and don't forget to give your eyes time to adjust. The Space Weather Prediction Center also suggests the best time to see the aurora is usually between 10 p.m. and 2 a.m. local time, when displays are at their most active.
How to use Python on an Android phone, iPhone, or iPad (with a Raspberry Pi)
There are ways to use the Python programming language on a typical Android device, iPhone, or iPad, but with fewer features available than Python on most desktop computers. Thankfully, you can still get the full Python experience on those devices—you just need a Raspberry Pi or other low-power server on the same network.
How to secure your data, connection, and wallet: Travel tips
People are traveling more and more. According to UN Tourism (World Tourism Barometer), more than 1.4 billion people traveled abroad in 2024, almost as many as before the pandemic. They include both vacationers looking to relax and working freelancers. However, with the revival of tourism, the risks of falling prey to fraudsters while traveling have made a comeback, too.
Nintendo Direct: Super Mario Galaxy movie trailer and cast revealed
Nintendo held a Nintendo Direct livestream...of sorts on Wednesday.
No, there weren't any trailers for new Switch 2 games or anything like that. It was only a few minutes long and was dedicated solely to the upcoming film The Super Mario Galaxy Movie. The sequel to 2023's blockbuster hit The Super Mario Bros. Movie is taking Mario and friends to space, just as the game Super Mario Galaxy did in 2007. The trailer is just a quick peek at the movie, but it looks a lot like the game it's inspired by.
SEE ALSO: Yoshi's design in 'Super Mario Galaxy' may have leaked thanks to a box of cookies
That's not all, though. Nintendo also revealed two key new cast members for the movie. Playing galactic protector Princess Rosalina is Academy Award winner Brie Larson, who, perhaps not coincidentally, is also a noted fan of Nintendo games. (She also has quite a bit of experience playing a galactic protector in the form of Captain Marvel.) That one makes sense.
On the funnier end is Uncut Gems co-director Benny Safdie as Bowser Jr., the movie's primary villain. Safdie has also worked as an actor, sure, but the idea of anyone at Nintendo watching and enjoying Uncut Gems is bringing me a lot of joy this morning. Safdie most recently appeared in Oppenheimer, Are You There God? It's Me, Margaret., and Happy Gilmore 2.
Behold, the itty bitty baby Bowser in all his glory. Credit: Universal and Illumination Pictures
The Super Mario Galaxy Movie comes out in theaters on April 3.
2026 Hyundai Palisade Hybrid blends power, efficiency, and comfort
The 2026 Hyundai Palisade takes the brand’s largest SUV to the next level with a hybrid setup that boosts both performance and efficiency. It’s not just about horsepower or fuel numbers, but about a ride that feels smoother, quieter, and more responsive in everyday driving.
Good Luck, Have Fun, Don't Die trailer makes an apocalyptic villain of AI
If the title of Good Luck, Have Fun, Don't Die didn't give you an idea of the bonkers energy Gore Verbinski's upcoming film is channeling, the trailer will lock it in.
In the video above, the Pirates of the Caribbean director sends you into a Los Angeles diner on a rainy night, where Sam Rockwell turns up in a strange (and leaking?) get-up claiming to be from the future. His warning? Technology and AI are just the tip of the iceberg when it comes to our impending doom. As weird things begin happening and birthday parties are ruined, we're in for a mission to save the world with an all-star cast including Zazie Beetz, Michael Peña, Juno Temple, Haley Lu Richardson, Asim Chaudhry, and Tom Taylor.
Written by Matthew Robinson (The Invention of Lying), Good Luck, Have Fun, Don't Die kind of looks like a film for people currently obsessing over apocalypses and possibly Apple TV+'s Pluribus (that's us, we're people).
Good Luck, Have Fun, Don't Die hits cinemas Feb. 13.
Lollipop—the Android update that changed everything
Although Android still receives major updates every year, they’re nowhere near as transformative as those from the early days of the OS. One of the biggest milestones was Android Lollipop, which began rolling out via OTA updates on November 12, 2014. Let’s take a look at everything that made this update so special.
The Daily Show brutally roasts Trump's plans for a 50-year mortgage
In a Truth Social post and an interview with Fox News, Donald Trump has floated the idea of a 50-year mortgage in the U.S., suggesting it might make home ownership more affordable for Americans. But Daily Show host Josh Johnson isn't convinced.
In the clip above from Tuesday's show, the host pulls up a graphic calculating that a six percent mortgage on a $400,000 home would mean roughly $300 a month less in repayments versus a 30-year mortgage, but over $800,000 in interest over the lifetime of the loan.
"So you're saying that after interest, a $400,000 mortgage is going to cost me $1.3 million?" says Johnson. "That is the opposite of affordability. This man is creating generational debt. They're going to be fighting to get out of grandma's Will. Grandkids will be like, 'I barely knew her, alright? I wouldn't even hug her at Christmas because her skin was too loose."
These beginner-friendly tools saved me from formatting Word docs by hand
With so many menus to dig through, getting a document to look decent in Microsoft Word can feel like a chore. I used to find formatting tedious, but after learning a few features and shortcuts, it became effortless. If formatting feels like a struggle, here’s how to do it faster and with less effort.
MMCTAgent: Enabling multimodal reasoning over large video and image collections
Modern multimodal AI models can recognize objects, describe scenes, and answer questions about images and short video clips, but they struggle with long-form and large-scale visual data, where real-world reasoning requires moving beyond object recognition and short-clip analysis.
Real-world reasoning increasingly involves analyzing long-form video content, where context spans minutes or hours, far beyond the context limits of most models. It also entails querying across massive multimodal libraries of videos, images, and transcripts, where finding and integrating relevant evidence requires more than retrieval—it requires strategic reasoning. Existing models typically perform single-pass inference, producing one-shot answers. This limits their ability to handle tasks that require temporal reasoning, cross-modal grounding, and iterative refinement.
MMCTAgent
To meet these challenges, we developed the Multi-modal Critical Thinking Agent, or MMCTAgent, for structured reasoning over long-form video and image data, available on GitHub (opens in new tab) and featured on Azure AI Foundry Labs (opens in new tab).
Built on AutoGen, Microsoft’s open-source multi-agent system, MMCTAgent provides multimodal question-answering with a Planner–Critic architecture. This design enables planning, reflection, and tool-based reasoning, bridging perception and deliberation in multimodal tasks. It links language, vision, and temporal understanding, transforming static multimodal tasks into dynamic reasoning workflows.
Unlike conventional models that produce one-shot answers, MMCTAgent provides modality-specific agents, ImageAgent and VideoAgent, equipped with tools like get_relevant_query_frames() or object_detection_tool(). These agents perform deliberate, iterative reasoning: selecting the right tools for each modality, evaluating intermediate results, and refining conclusions through a Critic loop. This enables MMCTAgent to analyze complex queries across long videos and large image libraries with explainability, extensibility, and scalability.
MMCTAgent on Azure AI Foundry Labs
How MMCTAgent works
MMCTAgent integrates two coordinated agents, Planner and Critic, orchestrated through AutoGen. The Planner agent decomposes a user query, identifies the appropriate reasoning tools, performs multimodal operations, and drafts a preliminary answer. The Critic agent reviews the Planner’s reasoning chain, validates evidence alignment, and refines or revises the response for factual accuracy and consistency.
This iterative reasoning loop enables MMCTAgent to improve its answers through structured self-evaluation—bringing reflection into AI reasoning. A key strength of MMCTAgent lies in its modular extensibility. Developers can easily integrate new, domain-specific tools—such as medical image analyzers, industrial inspection models, or specialized retrieval modules—by adding them to ImageQnATools or VideoQnATools. This design makes MMCTAgent adaptable across domains.
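As a rough illustration only, the Planner–Critic loop described above can be sketched as follows; the function names, signatures, and bodies are hypothetical stand-ins, not MMCTAgent's or AutoGen's actual APIs:

```python
def planner(query, tools, feedback=None):
    """Pick tools, gather evidence, and draft an answer (stubbed)."""
    evidence = [tool(query) for tool in tools]
    draft = f"answer({query})"
    if feedback:                     # revise the draft using Critic feedback
        draft += " [revised]"
    return draft, evidence

def critic(draft, evidence):
    """Validate the draft against the gathered evidence (stubbed)."""
    if not evidence:
        return False, "gather more evidence"
    return True, None

def answer_query(query, tools, max_rounds=3):
    """Iterate Planner -> Critic until the Critic accepts or rounds run out."""
    feedback = None
    for _ in range(max_rounds):
        draft, evidence = planner(query, tools, feedback)
        ok, feedback = critic(draft, evidence)
        if ok:
            return draft
    return draft  # best effort after max_rounds
```

The real system replaces these stubs with LLM-backed agents and modality-specific tools, but the control flow is the same: draft, critique, revise.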
VideoAgent: From ingestion to long-form multimodal reasoning
Figure 1. MMCTAgent’s Planner–Critic architecture enables multimodal reasoning over long-form video through structured ingestion, retrieval, and iterative feedback.
The VideoAgent extends this architecture to long-form video reasoning. It operates in two connected phases: library creation (ingestion) and query-time reasoning.
Phase 1 – Video ingestion and library creation
Before reasoning, long-form videos undergo an ingestion pipeline that aligns multimodal information for retrieval and understanding:
- Transcription and translation: Converts audio to text and, if multilingual, translates transcripts into a consistent language
- Key-frame identification: Extracts representative frames marking major visual or scene changes
- Semantic chunking and chapter generation: Combines transcript segments and visual summaries into coherent, semantically segmented chapters with associated key frames. Inspired by Microsoft’s Deep Video Discovery agentic search tool, this step also extracts detailed descriptions of objects, on-screen text, and characters present within each video segment, integrating these insights directly into the corresponding chapters.
- Multimodal embedding creation: Generates image embeddings for key frames, linking them to their corresponding transcript and chapter data
All structured metadata, including transcripts, visual summaries, chapters, and embeddings, is indexed in the Multimodal Knowledgebase using Azure AI Search (opens in new tab), which forms the foundation for scalable semantic retrieval and downstream reasoning.
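The four ingestion stages can be sketched as a simple pipeline. Everything below is a hypothetical stub standing in for the real transcription, key-frame, chaptering, and embedding components:

```python
def transcribe_and_translate(path):
    # stage 1: audio -> text segments (stubbed)
    return [f"{path}:segment-{i}" for i in range(3)]

def extract_key_frames(path):
    # stage 2: representative frames at scene changes (stubbed)
    return [f"{path}:frame-{i}" for i in range(2)]

def build_chapters(transcript, frames):
    # stage 3: pair transcript segments with nearby key frames (simplified)
    return [{"text": seg, "frame": frames[min(i, len(frames) - 1)]}
            for i, seg in enumerate(transcript)]

def embed_frames(frames):
    # stage 4: image embeddings for key frames (placeholder vectors)
    return {frame: [0.0] for frame in frames}

def ingest(video_path):
    transcript = transcribe_and_translate(video_path)
    key_frames = extract_key_frames(video_path)
    chapters = build_chapters(transcript, key_frames)
    embeddings = embed_frames(key_frames)
    # in MMCTAgent, this structured record is indexed into Azure AI Search
    return {"transcript": transcript, "chapters": chapters,
            "embeddings": embeddings}
```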
Phase 2 – Video question answering and reasoning
When a user submits a query, the VideoAgent retrieves, analyzes, and reasons across the indexed video content using specialized planner and critic tools.
Planner tools:
- get_video_analysis: Finds the most relevant video, provides a summary, and lists detected objects
- get_context: Retrieves contextual information and relevant chapters from the Azure AI Search index
- get_relevant_frames: Selects the key frames most relevant to the user query
- query_frame: Performs detailed visual and textual reasoning over selected frames
get_context and get_relevant_frames work in tandem to ensure that reasoning begins from the most semantically relevant evidence.
Critic tool:
- critic_tool: Evaluates the reasoning output for temporal alignment, factual accuracy, and coherence between visual and textual modalities
This two-phase design, structured ingestion followed by agentic reasoning, enables MMCTAgent to deliver accurate, interpretable insights for long, information-dense videos.
ImageAgent: Structured reasoning for static visuals
While the VideoAgent handles temporal reasoning across long-form videos, the ImageAgent applies the same Planner–Critic paradigm to static visual analysis. It performs modular, tool-based reasoning over images, combining perception tools for recognition, detection, and optical character recognition with language-based reasoning for interpretation and explanation.
Planner tools:
- vit_tool: Leverages a Vision Transformer (ViT) or Vision Language Model (VLM) for high-level visual understanding and description
- recog_tool: Performs scene, face, and object recognition
- object_detection_tool: Localizes and labels entities within an image
- ocr_tool: Extracts embedded text from visual elements
Critic tool:
- critic_tool: Validates the Planner’s conclusions for factual alignment and consistency, refining the final response
This lightweight ImageAgent provides fine-grained, explainable reasoning over image collections—supporting visual question answering, content inspection, and multimodal retrieval—while maintaining architectural symmetry with the VideoAgent.
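The modular extensibility noted earlier, adding a domain-specific tool alongside the built-ins, might look roughly like this; the registry layout and tool implementations are illustrative assumptions, not the project's actual code:

```python
# Hypothetical tool registry for the ImageAgent's planner.
ImageQnATools = {
    "vit_tool": lambda image: f"description of {image}",
    "ocr_tool": lambda image: f"text found in {image}",
}

def register_tool(name, fn):
    """Add a new domain-specific tool without touching the agent itself."""
    ImageQnATools[name] = fn

def run_tool(name, image):
    return ImageQnATools[name](image)

# A developer plugs in a medical image analyzer:
register_tool("xray_tool", lambda image: f"findings for {image}")
```

Because the planner selects tools by name from the registry, new capabilities become available to the reasoning loop without any changes to the agent's control flow.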
Evaluation Results
To assess the effectiveness of MMCTAgent, we evaluated both the ImageAgent and VideoAgent with multiple base LLMs across a range of benchmark datasets and real-world scenarios. Some key results are presented here.
Image datasets:

| Dataset | GPT-4V | MMCT with GPT-4V | GPT-4o | MMCT with GPT-4o | GPT-5 | MMCT with GPT-5 |
|---|---|---|---|---|---|---|
| MM-Vet [1] | 60.20 | 74.24 | 77.98 | 79.36 | 80.51 | 81.65 |
| MMMU [2] | 56.80 | 63.57 | 69.10 | 73.00 | 84.20 | 85.44 |

Video datasets:

| Dataset | GPT-4o | MMCT with GPT-4o |
|---|---|---|
| VideoMME [3] | 72.10 | 76.70 |

MMCTAgent enhances base model performance by augmenting models with appropriate tools, such as object detection and optical character recognition (OCR) for weaker models, or domain-specific tools for stronger models, leading to substantial improvements. For example, integrating these tools raised GPT-4V's accuracy from 60.20% to 74.24% on the MM-Vet dataset. Additionally, the configurable Critic agent provides further validation, which is especially valuable in critical domains. Additional evaluation results are available here (opens in new tab).
Takeaways and next steps
MMCTAgent demonstrates a scalable agentic approach to multimodal reasoning with a Planner–Critic architecture. Its unified multimodal design supports both image and video pipelines, while the extensible toolchain enables rapid integration of domain-specific tools and capabilities. It provides Azure-native deployment and supports configurability within the broader open-source ecosystem.
Looking ahead, we aim to improve efficiency and adaptability in retrieval and reasoning workflows, and to extend MMCTAgent’s applications beyond current agricultural evaluations, exploring new real-world domains through initiatives like Project Gecko to advance the creation of accessible, innovative multimodal applications for people around the globe.
Acknowledgements
We would like to thank our team members for their valuable contributions to this work: Aman Patkar, Ogbemi Ekwejunor-Etchie, Somnath Kumar, Soumya De, and Yash Gadhia.
References
[1] W. Yu, Z. Yang, L. Li, J. Wang, K. Lin, Z. Liu, X. Wang, and L. Wang. “MM-VET: Evaluating large multimodal models for integrated capabilities”, 2023.
[2] X. Yue, Y. Ni, K. Zhang, T. Zheng, R. Liu, G. Zhang, S. Stevens, D. Jiang, W. Ren, Y. Sun, C. Wei, B. Yu, R. Yuan, R. Sun, M. Yin, B. Zheng, Z. Yang, Y. Liu, W. Huang, H. Sun, Y. Su, and W. Chen. “MMMU: A massive multi-discipline multimodal understanding and reasoning benchmark for expert AGI”, 2023.
[3] Chaoyou Fu, Yuhan Dai, Yondong Luo, Lei Li, Shuhuai Ren, Renrui Zhang, Zihan Wang, Chenyu Zhou, Yunhang Shen, Mengdan Zhang, et al. “Video-MME: The first-ever comprehensive evaluation benchmark of multi-modal llms in video analysis”, 2024.
The post MMCTAgent: Enabling multimodal reasoning over large video and image collections appeared first on Microsoft Research.
These 5 simple Linux tools make Windows 11 look outdated
I am mainly a Linux user, but sometimes I have to return to Windows for work or gaming. Whenever I do, I notice that Windows hasn’t evolved in any meaningful way since I first used it. Compared to Linux, it feels like it’s stuck in time. Let me explain what I mean.
The Dyson HushJet is at its record-low price at Amazon — act fast to save $50
SAVE $50: As of Nov. 12, the Dyson HushJet is on sale for $299.99 at Amazon. That's a 14% discount on the list price.
If you're someone who suffers from allergies, has a pet, or just really hates the thought of lingering dust, an air purifier will become your best friend. They're handy devices that keep the area around you feeling clean and clear, as well as eliminating any unwanted odors. And as of Nov. 12, you can get one of the most popular models on sale at Amazon.
Right now you can get the Dyson HushJet on sale for its lowest-ever price. At Amazon you'll find the HushJet reduced to $299.99, a saving of $50 on list price.
SEE ALSO: The best vacuums we've tested at home, from robots to Dyson stick vacs
The Dyson HushJet is a compact but powerful air purifier that's perfect if you don't want a big, clunky device taking up space. And despite its size, it can purify up to 203 square feet. It has a fully sealed system, which means any pollutants or allergens it collects won't escape. It also has a 360° electrostatic filter that captures 99.97% of particles as small as 0.3 microns.
One of the best qualities of this device is its noise level. Operating at just 24dB in Sleep mode, it will deliver purified air quietly through a star-shaped nozzle that minimizes turbulence. If you can't focus or sleep with background noise, this purifier won't bother you.
Everything is controlled through the MyDyson app, so it's easy to monitor and control air quality from anywhere. You can even set schedules, adjust modes, and view personalized air quality reports.
This is a limited-time deal according to Amazon, so act fast so you don't miss out.
Jimmy Kimmel pays heartbreaking tribute to his best friend and band leader Cleto Escobedo III
On Tuesday, Jimmy Kimmel took to Instagram to announce that Cleto Escobedo III, the leader of his show's band and his childhood best friend, had passed away at the age of 59. In the monologue above, Kimmel pays an incredibly moving tribute to the man he's known since his family moved to Las Vegas when he was nine years old, and who he ended up working with on Jimmy Kimmel Live! for 22 years.
Over the course of the 22-minute monologue above, Kimmel describes his childhood friendship with Escobedo, explaining that Escobedo and his father both worked as professional saxophone players before they formed the shows's band when it launched back in 2003.
"I'm heartbroken to lose him. I'm going to take yet another lesson from him and acknowledge how lucky I was to have him literally at my side for so many years," says a tearful Kimmel. "Cherish your friends. We're not here forever."
Score a free $250 Amazon Gift Card when you sign up for Prime Visa
FREE GIFT CARD: Sign up for the Prime Visa card to get a free $250 Amazon Gift Card to spend over Black Friday.
Ahead of Black Friday, Amazon has been dropping some stellar early deals for shoppers. If you're keen to take advantage of some of the deals already available at the retailer and need a new way to save cash, Amazon is offering a limited-time deal that gets you a $250 Amazon Gift Card when you sign up for the Prime Visa.
This deal can be accessed from the Prime Visa application page, which also gives you the chance to see if you're pre-qualified. The good news is you won't have to wait for the gift card: you'll get it instantly upon approval. It's worth noting that this card is only available to those who don't already have an Amazon Visa or Prime Visa credit card.
SEE ALSO: Target Black Friday ad: The best deals you can buy online early
Once your application has been approved, you'll find the gift card loaded into your Amazon account. Another thing to keep in mind is that the gift card cannot be used to buy other gift cards, but there's still plenty else to shop at the retailer right now.
You can put this extra gift card cash towards a wide variety of items at Amazon that have already dropped in price. To see some of the items that are already on sale ahead of Black Friday, check out our roundup of the 40+ best early Black Friday deals.
Treat yourself to something nice over Black Friday. Don't miss out on Amazon's limited-time offer of a $250 Amazon Gift Card when you sign up for the Prime Visa. And if you're curious when the official sale event begins this year at Amazon, we've also gathered up everything you need to know about the retailer's plans in our breakdown.
Incogni Data Removal Service Review: A hands-off way to protect your data
I’m not the only person concerned about my data being exploited in the hands of data-broker sites. Incogni seeks to give users an easy way to remove said data from over 420 public and private data brokers. I found the service useful, but not without its weak points.
The best vacuums we've tested at home, from robots to Dyson stick vacs
The best vacuum cleaner setup is subjective, hinging heavily on whether you want to do the vacuuming yourself or would rather have a reliable robot vacuum do it for you.
SEE ALSO: As an anxious cat mom, I love my robot vacuum with a livestream camera
If your vacuum is a pain to clean with, you won't feel like using it at all — whether it's an upright vacuum that's too clunky to get out of the closet or a robot vacuum that gets stuck more often than not. Regardless of the type of vacuum you choose, there's still a ton of nuance between brands and models in both of those categories. Your best bet is to get some recommendations from someone who uses both every day (like me).
Black Friday vacuum deals are going to pop off
You're lucky to be doing your vacuum shopping in November — vacuum cleaner deals are a pillar of Black Friday. If you were previously ruling out a more premium Dyson vacuum or a robot vacuum and mop combo with self-washing mopping pads, this might be your chance to grab one at a less scary price. In fact, every single robot and cordless vacuum in this guide has already been on sale at some point in 2025. They're bound to beat their own sale prices this month.
After several years of pitting the top robot and stick vacuums against each other in my own home, I've pulled a few of each to create Mashable's ultimate mashup of all of the best vacuum cleaners of 2025.
Should I get an upright or robot vacuum?
I could argue for both sides. Ideally, I'd recommend shopping strategically and splitting your budget between one of each type of vacuum. But if you're only choosing one, consider what your main priority for this new vacuum is. Is your ideal vacuum cleaner one that automates the most steps to give you as little work as possible, or is it one that does the absolute most meticulous job, even if you still have to do the work?
Trusting a robot vacuum to do this chore comes with the fine print that the robot vacuum is probably going to piss you off sometimes. Autonomy-related features like smart mapping, small obstacle avoidance, automatic floor type recognition, and debris level sensors have gotten quite perceptive over the past few years. But even the smartest robot vacuums I've tested are still inanimate objects, not humans with real brains.
The obvious downside of an upright vacuum is that you have to have enough free time to use it, and be OK with spending some of that free time off the couch. Granted, the better the cordless vacuum is at cleaning, the less you'll have to get it out, and the less of a pain it'll be to whip around. But at the end of the day, if you detest getting your vacuum out, you probably won't vacuum too often — and that defeats the purpose of spending your money on a nice, new vacuum in the first place.
Vacuums I'm testing soon
The next vacuums slated for at-home testing are the Shark Stratos 2-in-1 NeverTouch, the Roborock H60 Hub Ultra, and eventually, the Dyson Spot+Scrub Ai. I'll be getting my hands on these as soon as possible and updating my top picks for pet owners accordingly.


