Blogroll

The best VPN for Kodi

Mashable - Tue, 03/26/2024 - 10:47

This content originally appeared on Mashable for a US audience and has been adapted for the UK audience.

If you’re a movie fan or enjoy nothing more than binge-watching box sets, chances are you've heard of Kodi. Kodi is a free, open-source media player app, formerly known as XBMC, that acts as a hub for all your music, movies, and TV content, which can be shared across all of your devices.

Benefits include being able to turn your gadgets into part of a digital streaming hub, accessing all sorts of file formats, searching for content in one place, watching geo-blocked content, and adding lots of customisable features via add-ons such as iPlayer, which are needed for streaming content. Of course, there are drawbacks too, but most of these can be circumvented with a VPN, which encrypts data leaving a device and makes it far harder for others to see what you’re downloading.

What is a VPN?

VPNs are security tools that protect your information by creating a private network that hides your real IP address. Your activity becomes far harder to trace, because all of your online traffic passes through an encrypted tunnel. Nobody can see into the tunnel, so everything inside it is shielded from snoopers such as hackers and nosy network operators.

The act of hiding your real IP address is what can trick leading streaming sites into thinking you are based in another country.
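If you want to see that trick in action, here's a minimal sketch in Python using the public ipinfo.io lookup service (any similar IP-echo service would do). Run it once normally and once with a VPN connected, and the reported IP and country should differ:

```python
# Minimal sketch: check which public IP address (and country) remote
# servers see for your connection. Run once without a VPN and once
# with one connected; the reported IP and country should change.
import requests  # pip install requests

def apparent_location() -> dict:
    """Ask a public IP-echo service what your connection looks like."""
    resp = requests.get("https://ipinfo.io/json", timeout=10)
    resp.raise_for_status()
    data = resp.json()
    return {"ip": data.get("ip"), "country": data.get("country")}

if __name__ == "__main__":
    print(apparent_location())
```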

Do you need a VPN for Kodi?

Having a VPN is important when using Kodi because the media player is popular and open source, so security flaws are inevitably exploited by hackers. These are predominantly found within third-party add-ons — some of which offer pirated material — rather than those in the Official Kodi Addon Repository. VPNs assign a virtual IP address to obscure your real location from others, reducing your exposure to scams and snooping. A VPN can also add an extra layer of protection to the computer or laptop running Kodi by securing its connection.

Using a VPN also lets a Kodi user stay anonymous, preventing hackers from gathering the personal data an add-on requires and assigning a different, untraceable IP address, keeping you safe and keeping your viewing habits private. This also makes it easier to watch geo-blocked and pirated content (which we don't condone).

Whatever you choose to use Kodi for, a VPN will enhance its features while offering much-needed security. 

What is the best VPN for Kodi?

When assessing VPN providers, we really like NordVPN for Kodi because of its unbeatable security and no-logs policy. One of the largest VPN services around, NordVPN has user-friendly apps for iOS, Android, macOS, and more — so all of your devices can be covered. We're also fans of ExpressVPN, which we found to be the easiest VPN to use for accessing geo-blocked streaming services like Netflix and Amazon Prime. Let's face it — Kodi isn't the only way you experience your entertainment, and if you're going to invest in a VPN for Kodi, it makes sense that it works well with streaming services too. We're all about options.

Whether you’re looking for high speeds or an easy interface, we've lined up all of your best options.

These are the best VPNs for Kodi in 2024.

Categories: IT General, Technology

I used the Apple Vision Pro on a flight to Costa Rica — and it was chaotic

Mashable - Tue, 03/26/2024 - 10:00

"The Apple Vision Pro is out-friggin’-standing, but what the hell am I going to use it for?"

This is what many tech journalists and reviewers with early access to the $3,499 swanky headset said. They gushed about the accurate eye tracking, the nuanced hand tracking, and the slick visionOS interface, but scratched their heads over how it’d fit into their daily lives.

I thought I knew the answer to their conundrum: travel. As a frequent jetsetter, I often spot AirPods Max or AirPods Pro among fellow travelers; they're some of the most popular travel accessories, letting people drown out the annoying drone of a plane. And I was certain that, in a few years, the Vision Pro could be the next big thing for travelers.

SEE ALSO: 5 reasons Apple Vision Pro will be the hottest new travel accessory

I put my hypothesis to the test, taking the Apple Vision Pro along with me on a JetBlue flight from JFK airport in New York to Juan Santamaría International Airport in Costa Rica. As it turns out, I may have spoken a little too soon about the Apple Vision Pro being the next breakout travel accessory.

Apple Vision Pro price

The Apple Vision Pro is often touted as a spatial computing headset with a $3,499 price tag, but with taxes included, it set me back $3,800 via Apple's website.

Fortunately, shipping is free. (Hey, it’s the least Apple can do, right?)

The configuration I got comes with the following specs:

  • 256GB of storage

  • Two 3,660 x 3,200-pixel micro-OLED displays with up to a 120Hz refresh rate

  • 16GB of RAM

  • M2 chip with 10-core GPU and the R1 co-processor

  • 12 cameras and five sensors

You can increase storage to 512GB and 1TB, but that’ll cost you $3,700 and $3,900, respectively — and that’s before the taxes kick in.

If I had the wherewithal, I’d grab the 512GB or 1TB configuration because I plan to store countless videos, including spatial content, over the course of my time with it.

What I like about Apple Vision Pro

So why did I bring my Vision Pro with me on my flight to Costa Rica? Apple boasted that the headset can bring a private cinematic experience to users, allowing them to expand the virtual display to a gigantic, floating screen.

Tired of the tiny, seatback screens, I thought the Vision Pro would rectify this issue, finally bringing a massive display bright before my eyes without disturbing other passengers.

Download shows for a theater-like experience

I subscribed to Apple TV and Max (formerly HBO Max) to download shows before my flight so that I could access them on the plane. Even without WiFi, I watched several episodes of Euphoria. I also watched the first half of Dune and a few minutes of Ex Machina, one of my favorite sci-fi movies of all time.

My Vision Pro offline downloads set to be played on my flight Credit: Kimberly Gedeon / Mashable

All of them played without a hitch, and unlike at the theater or on a laptop, I can make the virtual display as massive or as tiny as I want. Plus, I can toggle on captions, too. To top it all off, the quality, from the rich colors to the sharp imagery, is as crisp as seltzer water.

Easy to carry around

I’m sorry, but there’s no way in hell that I’m going to spend an additional $200 on a travel case. Instead, I had another pouch lying around my house that I used to store my Apple Vision Pro.

Carrying the Apple Vision Pro around my neck Credit: Kimberly Gedeon / Mashable

However, I barely used the makeshift travel case. Most of the time, the Vision Pro was hanging around my neck, giving me easy access to the headset.

The Vision Pro has an awkward, ski goggle-like design. The goggle portion consists of a laminated-glass front plate. It also features a stretchy, knitted rear headband. Once you put it on your head, you can tighten it with a circular dial on the right side of the device.

The Vision Pro is adjustable via a dial on the side of the device. Credit: Kimberly Gedeon / Mashable

As you'll find out in a later section, I'm not the biggest fan of the design — "comfortable" and "Vision Pro" do not belong in the same sentence.

Seamless TSA experience

Similar to your tablet and laptop, you’ll have to take the Apple Vision Pro out of your carry-on or personal bag before placing it on the conveyor belt for security scanning.

The Apple Vision Pro is, in fact, TSA-compliant. Credit: Kimberly Gedeon / Mashable

I placed my M2 MacBook Air and the Vision Pro in one tray and the headset came out the other end without being flagged as a threat. Nice!

‘Travel mode’ is a must

Before heading to the airport, I’d advise turning “travel mode” on.

Travel mode provides stabilization while flying. Credit: Kimberly Gedeon / Mashable

If not, while you’re walking — and even while you’re zooming through the air at 500 miles per hour — your apps will fly past you erratically. With travel mode off, windows would disappear from view, hiding in a far corner or under my chair. When my significant other spotted me frantically looking for something — and I looked extra silly with the Vision Pro strapped on — he said, “What are you looking for?” I said, “I can’t find the home apps! I think they flew under my seat.”

Looking for the apps that hid under my seat. Credit: Kimberly Gedeon / Mashable

Realizing how ridiculous I sounded, we both laughed. That’s why you need travel mode: it pins your windows right in front of you and holds them in place.

Picks up subtle hand tracking

As the only person on the entire plane with an Apple Vision Pro, I’m not gonna lie, I was super self-conscious about using it. I could just feel people staring daggers at me — and sniggering behind my back.

I felt a little ridiculous as the only person wearing the Vision Pro on my flight. Credit: Kimberly Gedeon / Mashable

Fortunately, the Vision Pro’s hand tracking is elite. To make selections inside the headset, you use a pinching gesture. Even the most subtle pinches — ones where I kept my hands low and close to my body so no one could see any conspicuous movement — were picked up with ease. It was super impressive!

And that’s on top of the incredible eye tracking. Being able to simply “tell” the headset what you want to click on by simply looking at it is so Black Mirror — and I’m here for it.

‘Mindfulness’ app for turbulence

On my way back from Costa Rica to New York, my JetBlue flight suffered from some turbulence while flying over Florida.


Sitting inside a rattling winged, metal thingamajig isn’t my idea of fun, so I pulled up the Mindfulness app and launched a meditation session. While being shaken and stirred inside the plane, I found my own slice of peace for five minutes while following breathing exercises with an expanding and contracting spherical virtual artifact.

What’s ‘eh’ about the Apple Vision Pro

Although I have plenty of positive things to say about the Apple Vision Pro, it still needs a lot of work. Let’s start with some things that make me go “eh” – features that aren’t necessarily dealbreakers, but they’re irksome enough to be noteworthy.

Battery pack requires you to wear the right clothing

If you’re going to drag the Vision Pro with you on a plane, you must wear something with pockets. Fortunately, I wore an athleisure two-piece set with pockets, allowing me to place the Vision Pro’s cumbersome battery pack inside my right pocket.

You're going to need somewhere to put the battery pack. Credit: Kimberly Gedeon / Mashable

You must make sure the pocket is snug, too, or else it will fall out frequently. Of course, the battery pack is a bit of a nuisance, but it fit nicely into my pocket, so I forgot it was even there.

Streaming with airplane WiFi is OK

Fortunately for me, JetBlue offered free WiFi. As such, I connected to the “FlyFi” network before takeoff. I streamed Apple TV movies including Killers of the Flower Moon and Napoleon, allowing me to watch films that I didn’t get to download beforehand.

I had a better screen than the seatback one provided on my flight. Credit: Kimberly Gedeon / Mashable

This is no fault of Apple, but as you can imagine, airplane WiFi is iffy. Consequently, the quality dropped significantly. I’d say visual content was streaming somewhere between 480p and 720p. On the plus side, I didn’t experience any signal dropping while watching movies — playback was seamless and consistent. However, your mileage may vary. Flying over the ocean, for example, can impact the plane’s WiFi.

‘Sharing’ is a pain

Sometimes, I’d stumble upon a funny or wild scene while watching a movie inside the Vision Pro and I wanted my significant other to have a laugh, too. 

Trying to show my partner a funny scene proved annoyingly difficult. Credit: Kimberly Gedeon / Mashable

I wished I could “freeze” the content where it stands and have my partner pick up where I paused so that he, too, could chuckle along with me. However, passing my headset to my partner was a pain. He had to ask me for my password and navigate back to the Apple TV app himself before resuming the episode I was streaming.

There is a “guest mode” you can turn on, but it’s still such a hassle. Perhaps in a future update, Apple will allow us to add “trusted users” to the headset so that they don’t need to go through so many steps before seeing the content I want them to see.

Some audio bleed, but Vision Pro works with AirPods

If you turn down the Vision Pro volume, it can be low enough that it won’t bother your neighbors or alarm flight attendants, but just loud enough for your content to be audible. The sound, by the way, is crisp, sharp, and full — à la the divine audio that comes out of my M2 MacBook Air.

If that doesn’t sit right with you, you can pair your AirPods to the Vision Pro — similar to how you’d pair one to your MacBook — allowing you to listen to Vision Pro content at any volume setting.

It’s embarrassing to wear

I became what The Verge coined a “glasshole” — someone who ignores social decorum and walks around with a silly headset all day. Some people approached me with curiosity, asking what was hanging around my neck — even a U.S. Customs and Border Protection officer inquired about the headset.

I definitely garnered some curious — and judgmental — looks. Credit: Kimberly Gedeon / Mashable

Others, however, stared aggressively. Some even sniggered behind my back, which is fair — I’d probably do the same if a Vision Pro-wearing passenger was grasping at the air to make in-headset selections.

Because wearing a headset is so uncommon and the Vision Pro is still in its “early adopter” phase, I felt like an alien walking around the airport in it. As someone who is already self-conscious in plain clothes, strapping a headset to my face only heightened my insecurity.

It will take some time before wearing an AR/VR headset becomes normalized.

The Vision Pro isn’t heavy, but it’s not comfy either

Apple touted the Vision Pro as a travel companion, but it still needs some tweaking before it becomes the next big thing for frequent fliers.

Credit: Kimberly Gedeon / Mashable

Early users of the Vision Pro, including tech journalists and influencers, said that the Vision Pro felt “heavy,” but I don’t think that’s the right word. The Vision Pro weighs about 1.3 pounds. When I pick it up with my hands, it feels quite lightweight.

Wearing the Vision Pro for too long became uncomfortable. Credit: Kimberly Gedeon / Mashable

The problem is, however, that the weight distribution of the Vision Pro is off. All of the components are frontloaded, which means I felt a lot of tugging and tightness around my eyes. I constantly needed to readjust the headset to reduce that “pulling down” feeling I was sensing on my face.

‘Tracking failed’ error messages

While flying to and from Costa Rica, I took nighttime flights, which meant that the cabin was quite dim. Yes, the Vision Pro was still usable, even in low light. It also helped that I left the seatback monitor on, since the glare of its screen added some illumination.

Tracking was spotty without adequate lighting. Credit: Kimberly Gedeon / Mashable

However, if I were to look down at my feet — away from the monitor’s glare — I’d get a “tracking failed” error message, and the virtual windows disappeared. When I popped my head back up, the content resumed without any issue.

Missing popular dedicated apps

I have Netflix, Hulu, and YouTube subscriptions. Unfortunately, none of these apps have a dedicated presence on the Vision Pro (although you can access them through Safari).

As such, to properly test the Vision Pro, I purchased two additional subscriptions: Apple TV and Max. After all, these two have apps tailor-made for the Vision Pro, and I wanted to see how well they performed. As mentioned above, Apple TV and Max work well on the Vision Pro, but make sure to download your favorite shows and films before your flight.

Material gets dirty easily

Yes, the knit material, particularly the stretch headband, is breathable.

The knit head strap is comfortable, but prone to dirt. Credit: Kimberly Gedeon / Mashable

But it’s definitely prone to getting dirty, so you’ll have to wash it often.

Eye strain

One thing I never quite understood about companies like Meta and Apple pushing for spatial computing is that they haven't addressed eye strain.

SEE ALSO: The XREAL Air 2 smart glasses are a huge improvement from the original model

Airports can be tiring, from queuing at the TSA checkpoint to waiting for delayed planes.

By the time you get on the plane, you're exhausted. The moment I put on the Vision Pro, with its two high-tech displays sitting right before my peepers, I thought to myself, “I can’t have this on my face for more than 20 minutes!"

Apple Vision Pro battery life

I watched Wonka from start to finish and half of Priscilla on the Apple Vision Pro. With some Apple Arcade gaming sprinkled in (e.g., Jetpack Joyride), the Apple Vision Pro lasted three hours and 34 minutes on a single charge. (This battery life test was not conducted on the plane.)

This is much better than I expected. For reference, the Meta Quest 3 lasted one hour and 19 minutes.

SEE ALSO: The best laptops for battery life, according to our expert tests

Is the Apple Vision Pro worth it?

For Apple’s first foray into the AR/VR space, the Vision Pro is a best-in-class headset. Early reviewers were spot on when they gushed about the eye and hand tracking — it’s smooth, seamless, and sensational.

Credit: Kimberly Gedeon / Mashable

And I can understand why the Vision Pro costs an arm and a leg. At its core, the Vision Pro is a computer you can strap onto your face. If people are spending over $2,000 for laptops, it’s not too hard to grasp why you’re shelling out $3,500 for a nascent, wearable spatial computing device (which, of course, includes the “Apple tax”).

However, this review focuses on the Apple Vision Pro as a travel companion. And as it stands now, it’s not quite ready for primetime. While the Vision Pro can, indeed, deliver a cinematic experience for travelers as Apple promised, the Vision Pro simply isn't comfortable enough.

It’s also worth considering the Meta Quest 3, which at $499 is $3,000 cheaper.

Categories: IT General, Technology

Save an extra 20% on this Luminar Neo bundle

Mashable - Tue, 03/26/2024 - 10:00

TL;DR: Through April 2, you can get a lifetime subscription to the award-winning Luminar Neo app, along with six add-ons, for only $159.99 with the code GET20.

AI often gets flak from creatives because, a lot of the time, the output it churns out is either uninspired or ventures into weird territory, which is fair. But AI can be a powerful tool in enhancing existing projects and does a great job of making work not feel like work. For instance, Luminar Neo is an award-winning app that simplifies complex photo editing jobs, making your post-processing setup much easier.

With just a few clicks, Luminar Neo can turn any work into a masterpiece. Through April 2, you can grab it, along with a bunch of nifty add-ons, on sale for $159.99 with the code GET20.

A Red Dot Award 2022 recipient for Interface Design, Luminar Neo harnesses AI to make short work of photo editing. If you're a creative professional, you must know firsthand how laborious it is to tweak photos to make them match your vision, but with this app's AI-driven tools, editing becomes less time-consuming. It does everything from replacing skies and enhancing landscapes and portraits to removing unwanted objects and adjusting every aspect of light. If you're working with multiple photos at once, you can use the panorama stitching feature to patch them together and create panoramas.

Luminar Neo's latest iteration even comes with nifty new features, including focus stacking, pixel-perfect upscaling, seamless background removal, and HDR merging. For large-scale projects, you can also take advantage of presets, so you can create a consistent look across all your photos.

A total of six add-ons are included in this bundle, each of which serves a specific purpose. Tender Blushing Skies includes various cloud shapes and textures, Tranquil Dawn features overlays of stunning sunsets and sunrises, and Frosty Winter comes with LUTs that emphasize the cold winter vibe. Meanwhile, Light Reflections packs light effects of varying directions and intensities, Color Harmony has vibrant color presets, and Wintertime Overlay features an array of atmospheric textures.

Supercharge your editing process with this Luminar Neo lifetime bundle. Normally $752, you can grab it on sale for just $159.99 with the code GET20 until April 2, 11:59 p.m. PT.

StackSocial prices subject to change.

Categories: IT General, Technology

These noise-cancelling wireless earbuds are on sale for $59.99

Mashable - Tue, 03/26/2024 - 10:00

TL;DR: Through April 2, these JBL Tune Buds Active Noise-Canceling Earbuds are on sale for $59.99 (reg. $99.95).

Our worlds are typically pretty hectic. Between work and home, there's a lot happening to pull your focus. Limit the distractions of the outside world with noise-canceling earbuds like these JBL ones. For a limited time, they are on sale for just $59.99 (reg. $99.95).

Tune out all the banter when you use these earbuds in active noise-canceling mode. Listen to lo-fi music at your desk while you work to be more productive, or use them to study at home while the rest of the family is free to live their lives.

These JBLs also feature Smart Ambient technology, which lets you switch out of noise-canceling mode so you can tune in to the surrounding sounds. This can be helpful if you're out for a walk with them in your ears, especially on a road with vehicular traffic.

Connect to your devices wirelessly using Bluetooth 5.3, and control your entire earbud experience using the JBL Headphones app. The buds also have four-mic tech, allowing you to answer calls without missing a beat.

Offering a long battery life supported by the included charging case, they're also water-resistant, so you can use them for workouts or walks to work in the rain.

What does open box mean? This is considered a new, open-box item, which is typically excess inventory from stores that might have been in contact with customers. You may see some signs of extra handling or random stickers.

Join the freedom that comes with wireless listening when you get these JBL Tune Buds Active Noise-Canceling Earbuds on sale for $59.99 (reg. $99.95), but only for a limited time.

StackSocial prices subject to change.

Categories: IT General, Technology

Digitize your scans for just $32 with this advanced app

Mashable - Tue, 03/26/2024 - 10:00

TL;DR: Through April 2, you can score a lifetime subscription to iScanner for only $31.99 with coupon code GET20.

The list of what your smartphone cannot do seems to be getting shorter by the day. It can take photos and videos, tell you how to get somewhere, measure stuff, monitor your heart rate, translate conversations in a foreign language, notify you if you need to bring your umbrella to work the next day, and plenty more. Depending on what type of vehicle and phone you have, it can even start your car. And if you're all about embracing modernity, you can use your smartphone to digitize your entire life. Well, almost.

If you want to make paper a thing of the past, digitize documents, receipts, photos, and more, with iScanner. Your phone may already have a default scanning tool, but iScanner takes it up a notch by helping you organize every scan and even do some light editing on them. Through April 2, you can grab a lifetime subscription to the app for only $31.99 with code GET20.

This iOS app can help you digitize virtually anything. It scans everything from contracts and tax forms to tickets and receipts to the handwritten love letter from your childhood sweetheart you managed to excavate from the depths of your old closet. All scanned documents are exportable in various formats, too, including PDF, JPG, DOC, XLS, PPT, or TXT.

If a certain document looks a little worse for wear, iScanner's AI-powered editing tools can come to the rescue. You can correct the color, remove noise, adjust borders, straighten pages, and more. You can also mark up documents and conceal parts that need hiding, add text and watermarks, insert signatures, and autofill forms using custom templates.

Organization-wise, iScanner has a built-in file manager complete with folders for easy document management, and you can sort your stuff accordingly with the drag-and-drop tool. And for top secret files, you can lock them with a custom PIN.

On top of scanning, the app can also assist you with complex math problems. Plus, with its advanced scanning capabilities, you can use it to measure an object's length, calculate its area, and count similar objects.

Ready to go fully digital? Formerly $199, grab a lifetime subscription to iScanner for only $31.99 with the coupon code GET20 until April 2, 11:59 p.m. PT.

StackSocial prices subject to change.

Categories: IT General, Technology

Pay only $70 for this new-to-you HP Chromebook

Mashable - Tue, 03/26/2024 - 10:00

TL;DR: Through April 2, get a new computer without breaking the bank with a refurbished 11.6-inch HP Chromebook on sale for only $69.99.

Planning a tech upgrade often involves mental gymnastics: how do you justify a hefty purchase, and will the budget for your new computer have to come out of next month's groceries? But upgrading doesn't always have to put a damper on your finances, especially if you opt to buy refurbished items.

If you're eyeing a simple computer for work or school, this 11.6-inch HP Chromebook could be a worthy investment. The good news? The investment doesn't even have to be that big, with a refurbished unit of this same computer on sale for only $69.99. Yup, it's that affordable.

This budget-friendly Chromebook is great for students and casual users whose tasks only require minimal computing power. Think browsing the web, managing email, browser-based research, document processing, and the occasional YouTube binge. Its Intel Celeron N4000 processor handles such basic tasks, with its 11.6-inch screen displaying content in HD quality. It also packs Intel UHD Graphics 600, allowing it to handle basic graphic design tasks and even some light gaming.

The computer runs on Chrome OS, boasting quick boot times, a streamlined and intuitive interface, and seamless integration with Google services you likely already use on a regular basis, including Google Drive, Gmail, Google Docs, Google Meet, and many more. This laptop also happens to be the Education Edition, so it has features tailored for educational use, like management software for teachers and educational programs for students. Plus, thanks to its long battery life, you can trust it to run a full day.

Since this unit is refurbished, it should be noted that it has a grade "B" rating, meaning it may have minor scuffing or marks on the surface. But despite these minute scratches, the quality remains uncompromised.

Don't overspend on tech. Regularly $80, grab this refurbished HP Chromebook on sale for just $69.99.

StackSocial prices subject to change.

HP 11.6-inch Chromebook G7 EE (Celeron N4000, 4GB RAM, 16GB storage, refurbished): $69.99 at the Mashable Shop (reg. $80)
Categories: IT General, Technology

Emotional support platform 7 Cups beset by trolls

Mashable - Tue, 03/26/2024 - 10:00

Psychologist Glen Moriarty founded the emotional support platform 7 Cups in 2013 as a way to help people listen to each other's concerns, particularly when they had nowhere else to turn. Users are free to be as vulnerable as they wish, provided they obey the platform's community guidelines and terms of service. 

The platform, which users can join at no cost, may seem like the perfect solution to both the loneliness epidemic and the broken American mental health care system, which is expensive and hard to access. 

But for some users, 7 Cups comes with its own high cost: trolling and abusive behavior. 

A months-long investigation into 7 Cups found that the platform sometimes struggles to contain and address problems with users who act inappropriately, poorly, or aggressively, or even threaten other users. In the past, such abuse has included discussion of sexual acts and fetishes as well as comments directing another user to kill themselves. Mashable found that teens may be targeted by predators.

This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.

High-level current and former staff and volunteers who spoke to Mashable anonymously because they didn't want to violate a nondisclosure agreement they signed say 7 Cups' approach to punishing those who violate the platform's rules can be surprising or confusing. Users, for example, have been encouraged by Moriarty himself to help rehabilitate "trolls" who behave poorly. 

Moriarty denied that trolling was pervasive on the platform and noted that the company has taken steps over the last decade to improve user safety. 

"We are constantly solving problems, getting stronger, and continue to hold true to our core mission of helping the community," Moriarty told Mashable in an email. 

Trolls have existed on 7 Cups for years

Moriarty has long known about bad actors on 7 Cups, because he's personally been subject to their unwelcome behavior. 

In June 2020, a few months into pandemic isolation, he dedicated a forum post on 7 Cups to the subject of "people who are trolling." He noted that his own experience on the platform "has included all types of trolling," including what he described as "sexual trolling," wherein "the person is trying to engage with you — sneakily — in a sexual manner." 

Moriarty's advice to 7 Cups users about how to handle trolling was largely unconventional: He encouraged victims of abuse and harassment to attempt to persuade the other user to change their behavior. 

His sample script included empathetic statements like, "I know that life has likely been challenging for you…I think that is partly why you are behaving towards me like you are right now."

The post elicited dozens of responses, including from users who'd been harassed. 

One commenter challenged Moriarty's conviction that the platform's bad actors could be rehabilitated. They wrote: "[W]hat about the troll that keeps telling me to eat poison soup and go to the grave though?" 

Another listener chimed in: "I'm lucky my troll finally decided to leave me alone. I say that because I don't feel that Cups did enough to protect me as a listener from the vile filth spewed forth in my inbox." 

When asked about these remarks, Moriarty noted that other commenters reported "positive interventions they utilized in response to people trolling."

He told Mashable that "we take numerous steps to address and stop trolling behavior," including auto-detection of ​​abusive activity and the use of blocking, muting, and reporting tools. He also said that the company has been developing a tool powered by artificial intelligence that can scan and identify messages that violate the platform's terms of service and guidelines in one-on-one and group chats. 

"Our expectation is that this will make circumventing our existing safety processes and guidelines very, very difficult," Moriarty noted.
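7 Cups hasn't detailed how that in-development tool works, but automated message screening generally shares the same pipeline shape regardless of the model behind it. Here's a deliberately simplified Python sketch, with a keyword blocklist standing in for a trained classifier; the phrases and function are illustrative only, not 7 Cups' system:

```python
# Simplified stand-in for automated chat screening: a production system
# would use a trained classifier, but the pipeline shape is the same --
# score each message, then block, flag, or allow it.
BLOCKLIST = ("kill yourself", "eat poison")  # illustrative phrases only

def screen_message(text: str) -> str:
    """Return a moderation decision for a single chat message."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in BLOCKLIST):
        return "block"   # censor the message and surface it to moderators
    return "allow"

print(screen_message("hope you have a good day"))  # -> allow
```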

Whitney Phillips, assistant professor of digital platforms and ethics at the University of Oregon, reviewed a copy of Moriarty's 2020 post and the comments. She characterized Moriarty's approach to trolling behavior as harmful to users. 

Phillips, author of This Is Why We Can't Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture, said it's a common misconception that people always troll because they're wounded and act out for attention. 

Instead, the behavior is often game-like. They derive joy and pleasure out of finding ways to make someone feel uncomfortable. They're not desperate for validation and often can't be deterred by appeals to a better self, said Phillips. 

She also warned against asking triggered or traumatized users to rehabilitate their abuser, a request she described as "cruel." The responsibility of holding trolls accountable, and protecting victims, should rest with 7 Cups, Phillips said. 

"To offer this advice, it's mismatched with the kinds of behaviors that are clearly chronicled in the comments," she added. 

Multiple listeners secretly trolled other users

Moriarty's discussion of trolling also didn't reveal a discovery that 7 Cups staff said they made years ago: Some of the platform's highly rated listeners had alternate secret accounts they used to harass or bully other users. Moriarty denied this and said the behavior violated the platform's terms of service. Former staff said they stumbled across the problem when attempting to identify the platform's best listeners. 

7 Cups had already deployed an algebraic formula to determine trust and reputation "scores" for listeners, which helped identify trolling accounts, as well as users demonstrating good behavior. 
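7 Cups has never published that formula. Purely for illustration, a trust score of this kind typically folds several weighted behavioral signals into one number; every signal and weight in the sketch below is invented:

```python
# Hypothetical listener trust score -- 7 Cups has not published its
# formula, so these signals and weights are invented for illustration.
def trust_score(positive_ratings: int, chats_completed: int,
                reports_received: int, account_age_days: int) -> float:
    """Fold several behavioral signals into a single number."""
    helpfulness = positive_ratings / max(chats_completed, 1)  # 0..1
    tenure = min(account_age_days / 365, 1.0)                 # caps at one year
    penalty = 0.2 * reports_received                          # each report hurts
    return max(0.0, 5.0 * helpfulness + 2.0 * tenure - penalty)
```

Any resemblance to the real formula is coincidental; the point is only that a handful of signals can rank accounts at both extremes, flagging likely trolls and likely model listeners alike.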

It wasn't long before staff noticed inappropriate, trolling, or bullying accounts registered to the same email addresses as highly rated listeners, or other telling links between such accounts.

"You couldn't just say, 'This person's great and you can trust them all the time,'" a former staff member said. 

The severity of the trolling problem led 7 Cups to control the demo environment by using hand-picked listeners who wouldn't sink the company's chances of landing a lucrative deal by engaging in offensive or abusive behavior, according to multiple sources who worked for the platform over the last several years, a claim that Moriarty also denied. 

Phillips said she's unsurprised that people engaging in trolling behavior have conflicting personas and accounts. Trolling actually requires good listening skills, according to Phillips' research. Such users must pay close attention to someone's vulnerabilities. But those who engage in trolling also possess the social skills to weaponize those vulnerabilities. 

Phillips believes it's generally a mistake to simply observe people's online behavior and assume their actions are sincere, but especially in a digital environment premised on helping others. Instead, there's the real possibility that people on emotional support platforms may be bored or even mean. 

"People play in dark directions and light directions and lots of directions in between," she said. "They do all kinds of things for all kinds of reasons that don't fit into any clear-cut box, particularly one that takes sincerity as the default mode of human expression."

Dealing with trolling on 7 Cups

One infamous user has wreaked havoc on the platform by bullying, abusing, and threatening other users since at least 2019. They've been given opportunities to improve and rehabilitate their behavior, which Moriarty acknowledged occurred years ago as an attempt to coach the user to behave in more "prosocial ways." 

When they've violated those expectations and been banned, they've found ways to create burner accounts at a pace that 7 Cups staff has not been able to effectively counter. Moriarty said that when moderators recognize the user or their behavior, they are banned in under a minute.

Recently, 7 Cups began requiring listeners to verify their phone number, which can more closely tie a user's identity to their behavior, if they use a real number. Those who want to avoid detection can easily obtain a throwaway number from various online services. Moriarty said members would soon have to go through phone verification.

In order to deter abusive behavior and set expectations, 7 Cups uses a points system for listeners and members, but some of the punishments can seem surprisingly unclear or lenient. An adult-teen listener who gives their contact information to a teen but doesn't initiate off-site contact isn't permanently banned from the platform but is instead given three behavioral points as a consequence, for example. Initiating off-site contact with a teen is a four-point offense. 

Both violations result in a warning, a break from the platform, and removal of the user's adult-teen listener badge and access to the teen community. While this is unclear from the points system chart, Moriarty said that any adult-teen listener who behaves this way is put on a months-long break. They also lose their badge and cannot regain it in the future. Ten or more points leads to suspension from the platform, but points can also expire six months after they are accrued. Moriarty told Mashable that the points system is similar to how points on a driver's license work.
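Based solely on the details reported above (three- and four-point offenses, suspension at ten or more points, expiry six months after accrual), the mechanics can be sketched like this; everything else about the real system is unknown:

```python
# Sketch of the points system as described: violations add points, points
# expire six months after accrual, and ten or more active points mean
# suspension. Only the 3- and 4-point values above come from the article.
from datetime import datetime, timedelta

EXPIRY = timedelta(days=183)        # roughly six months
SUSPENSION_THRESHOLD = 10

class PointsLedger:
    def __init__(self) -> None:
        self.entries: list[tuple[datetime, int]] = []

    def add_violation(self, when: datetime, points: int) -> None:
        self.entries.append((when, points))

    def active_points(self, now: datetime) -> int:
        """Count only points accrued within the expiry window."""
        return sum(p for when, p in self.entries if now - when < EXPIRY)

    def is_suspended(self, now: datetime) -> bool:
        return self.active_points(now) >= SUSPENSION_THRESHOLD
```

Run through this model, the two offenses the article cites (3 points plus 4 points) leave a listener at 7 active points, still below the suspension line, which helps explain why sources describe the punishments as lenient.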

When users are caught, violations that lead to immediate removal from 7 Cups include spamming forums with inappropriate content or ads; posting inappropriate or graphic pictures; repeatedly sexting; and being underage or signing up for the wrong age community.

A former high-level volunteer who left the platform in 2020 said that its rules were unevenly applied. Newer members who committed more serious infractions were often bounced from the platform, but established listeners with a good trust score and a rapport with moderators might be given dispensation. 

"If there is not consistent enforcement of the rules, it creates a permission structure for anything to happen," said Phillips. 

Moriarty noted the difficulty of knowing exactly what happened in each situation that involved a violation of the rules.

"Not all cases are black and white," he told Mashable. "I imagine there have been uncertain issues, vague situations, or competing explanations where it could be interpreted as dispensation, but likely not significant." 

Multiple sources who've worked or volunteered at 7 Cups stressed that they've tried to elevate safety issues and solutions over the years, with limited success. They felt that costly initiatives or efforts that might negatively affect growth but improve safety were ignored or rejected by senior management or Moriarty. He told Mashable that this characterization was inaccurate.  

Though 7 Cups employs blocking and reporting tools, as well as the ability to ban users, those strategies are stretched when bad actors try repeatedly, and doggedly, to regain access to the platform by creating a new anonymous account. Currently, when a member is temporarily removed or banned from 7 Cups, it can be easy to make a new account using a quickly generated burner email address and a new fake persona.    

Security tools to stop trolls can be bypassed

Sources familiar with 7 Cups' security protocol say the site attempts to prevent bad actors from creating multiple burner accounts by tracking users' internet protocol (IP) addresses. Yet this tactic is rendered useless if someone accesses the internet through a virtual private network, which can conceal their digital identity. 

Additionally, an IP address is an imprecise tracking tool as it can be assigned not to a user's device, but to the coffee shop they frequent or their dorm building. Banning a user based on that information might unintentionally ban dozens or hundreds of other people using that address. 

A device fingerprint, a more specific set of data that can be tied to an individual device, can help narrow the search. Yet it's also an imperfect solution, given that sophisticated bad actors can use technology to mimic or hijack the identity of a different device.
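To see why any single signal fails, consider a hypothetical ban check (the field names below are invented; this is not 7 Cups' code). Requiring one matching signal means a VPN defeats it; requiring several independent signals to line up is harder to spoof all at once:

```python
# Hypothetical ban check (not 7 Cups' code; field names are invented).
# With required_matches=1 and only an IP on file, any VPN evades the ban;
# spoofing several independent signals at once is much costlier.
BAN_SIGNALS = ("ip_address", "device_fingerprint", "phone_number")

def matches_ban_record(session: dict, ban_record: dict,
                       required_matches: int = 2) -> bool:
    """Count independent signals tying a new session to a past ban."""
    hits = sum(1 for s in BAN_SIGNALS
               if session.get(s) and session.get(s) == ban_record.get(s))
    return hits >= required_matches
```

Even then, a determined evader who changes all three signals slips through, which is why no single check settles the problem.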

As a result, sources say that for years the platform's moderators have played whack-a-mole trying to catch users who've been banned for various infractions but quickly return with a new account. 

John Baird, cofounder and CEO of the identity verification company Vouched, told Mashable that while an IP address and device ID can help identify bad actors, they shouldn't be the sole way to verify an identity and block a user from accessing a platform. Vouched, for example, uses visual evidence, algorithmic evaluation, geo-location data, and device-related information, among other strategies, to verify identity and vet an individual's risk to an organization. 

"Security is always multiple factors stacked on top of one another to be able to catch the bad guy," said Baird. "The challenge is, if it's a single factor, the bad guys will figure out a way around that single factor."

Moriarty told Mashable he was confident that new technology solutions, like the AI-powered speech detection tool, would be "more effective at scale than anything else has been to date." 

He also acknowledged that 7 Cups may have fallen short despite its efforts: "We understand that we are far from perfect, but have worked hard and continue to work hard on this issue." 

Still, the evolution of security on 7 Cups has arguably taken a toll on its members and moderators.

Last summer, a user made multiple accounts on the platform and told people to kill or harm themselves. Mashable viewed evidence of the incident and its fallout. 

Separately, the user who has frequently engaged in abusive behavior over the past several years was also creating new accounts to evade bans, exhausting the platform's moderators with their efforts to stay on the site. 

People complained when the user began a new harassment campaign last year, including telling a listener to kill themselves, according to documentation shared with Mashable. 

Of this incident, Moriarty said that censors blocked the language and that the user was removed: "The system worked as designed." 

According to a source familiar with the problem, the user has continued to harass members and vex moderators since then.

If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can reach the 988 Suicide and Crisis Lifeline at 988; the Trans Lifeline at 877-565-8860; or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email info@nami.org. If you don't like the phone, consider using the 988 Suicide and Crisis Lifeline Chat at crisischat.org. Here is a list of international resources.

Categories: IT General, Technology

Talking to someone online for emotional support may be riskier than you realize

Mashable - Tue, 03/26/2024 - 10:00

At a time when loneliness is a crisis, HearMe is Adam Lippin's calling. He founded the digital platform in 2018 as a place where a user can talk to someone online and "get something off your chest." The platform matches that user with a "peer listener" who's meant to be supportive. Both people can remain anonymous.  

But Lippin eventually learned that not everyone who logs onto a platform like HearMe has a sincere interest in making an emotional connection. In 2022, it became clear that some users were visiting HearMe to play out fantasies that involved sexual language and innuendo, Lippin told Mashable. 

On the other side of those messages were often psychology interns and graduate students in social work who volunteered on the service to fulfill their educational requirements. Lippin hoped the bad actors could be discouraged by responses that reframed or ended the conversation. But that didn't work. 

"It was like whack-a-mole," Lippin said. "It just didn't stop." 

So Lippin made a risky, consequential decision: HearMe stopped offering a free membership. Soon after, the problem largely ceased, Lippin said. 

"I learned a lesson," he said of online emotional support. "It's like anything — it can be used for good and bad."  

Lippin isn't the only founder and CEO to launch a company designed to alleviate loneliness by connecting strangers with each other. Companies like Wisdo Health, Circles, 7 Cups, and HeyPeers aim to fill gaps in a broken mental health care system by offering users the opportunity to talk to someone online. Like Lippin, some founders find their mission complicated by bad actors with other ideas about how to use the platform. 

A months-long Mashable investigation into these emotional support platforms, including the popular free service 7 Cups, found that users may be exposed to moderate or significant risk in their pursuit of consolation and connection. 

This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.

In one 2018 case, a 42-year-old man posed as a 15-year-old teen on 7 Cups to access the platform's teen community. He manipulated a 14-year-old girl into creating child sex abuse material and was ultimately charged and jailed for the crimes. That same year, 7 Cups won a contract to provide its services to residents of certain California counties, but its contract was cut short in 2019 after safety concerns emerged, among other issues.

In general, risks on emotional support platforms include encountering an anonymous stranger who's well-meaning but ultimately hurtful, or a purposefully cruel bad actor who, for example, tells someone hoping to feel less alone to kill herself. 

While these issues were most egregious on 7 Cups, Mashable tested other platforms in this market, interviewed some of their members, and spoke with their CEOs, and found that 7 Cups' competitors have faced a range of challenges. These startups are under pressure to develop a successful, scalable business model, all while battling bad actors who find ways to circumvent common safety measures. 

It's not unlike what happens every day on the internet, but in this case the victims can be emotionally or psychologically vulnerable people who opened up to a stranger believing they were safe.   

Unlike in formal mental health treatment, there is currently little recourse for those who've been seriously harmed by their conversations on an emotional support platform. The field is largely unregulated, and federal law has traditionally immunized online platforms from liability in many instances when their users are harmed.

Meanwhile, if someone seeks compassion on an emotional support platform but finds predation and abuse instead, it may have lasting damage. 

"I think you have very real risk that somebody would view this as part of being the quote-unquote mental health system, and if they had a bad experience, I can imagine them never engaging in mental health again, or never seeking other types of treatment or support again," said Dr. Matt Mishkind, a researcher who studies technological innovation in behavioral health as deputy director of the University of Colorado's Helen and Arthur E. Johnson Depression Center.  

What is peer support? 

These companies often use the term peer support to describe their services. Most people who hear this probably imagine a reputable in-person or virtual group run by a mental health provider or organization. 

The National Alliance on Mental Illness' peer-to-peer program, for example, brings people coping with mental illness, or their families, together under the supervision of a trained facilitator. Research indicates that these programs may help with recovery.

Less familiar are peer support specialists, a growing workforce of trained individuals who draw on their own lived experience with mental illness or substance use to aid someone in recovery, in a clinical or outpatient setting. 

This type of intervention shows promise in clinical research for people with mental health conditions. Some studies note small to modest improvements in symptom remission and improved quality of life. Last year, Blue Cross and Blue Shield of Minnesota announced that access to peer support specialists would be a covered benefit for certain members beginning in 2024. 

Peer support specialists, however, do not staff all emotional support platforms. HeyPeers does allow certified peer support specialists to offer their services for a fee, and HearMe users may engage with them as well. 

This distinction between peer-to-peer support versus peer services led by trained individuals who adhere to standardized peer-practice guidelines is important. Someone who downloads an app marketed as offering peer support may not, in fact, talk to a trained peer professional.

How does peer support work? 

When a person does have a positive experience on an emotional support platform, it can be life changing. 

Mashable interviewed two participants of Circles' facilitated support groups, who said their weekly interactions with other members helped them feel less alone and more prepared to handle emotional challenges. 

That service is separate from Circles' free offering, which allows users to gather in hosted chat rooms, discuss topics like parenting, self-care, and workplace stress, and anonymously direct message each other. 

Once someone has received help on an emotional support platform, they may derive great satisfaction out of extending similar compassion to someone else, in a listener role, according to people who've used different platforms and spoke with Mashable about their experiences. 

Still, there is no high-quality research demonstrating that digital emotional support platforms are as effective as peer support specialists or even computer-based cognitive behavioral therapy treatments.

Some of the past studies on 7 Cups weren't rigorous or large enough to draw any conclusions. Four studies conducted between 2015 and 2018 were largely focused on testing the platform rather than establishing high-quality clinical claims of efficacy. Some of the studies had fewer than 20 participants. Regardless, the company continues to advertise its platform as "research-backed" and "evidence-based," a claim its founder and CEO Glen Moriarty defended to Mashable. He noted that the platform's "self-help guides" and "growth paths" are based on types of therapy shown to be effective, including cognitive behavioral therapy and dialectical behavioral therapy.

Other companies have published their own research. 

Last year, Wisdo Health published a study in JMIR Research, which found that users experienced decreased loneliness and depression symptoms, among other improvements, after using the platform. The authors also noted that a randomized controlled trial comparing "peer support" to interventions like cognitive behavioral therapy "would be a valuable contribution to the literature."

"It's an exciting moment to be working in this space because it's graduating to a depth of conversation which I'm not sure that peer support has enjoyed in the past," Wisdo Health founder and CEO Boaz Gaon told Mashable in an interview last year. The company, which was founded in 2018 and claims to have 500,000 users, offers clinical referral services to users who are identified as potentially benefiting from therapy. 

Ryan K. McBain, a policy researcher at the RAND Corporation who has examined the efficacy of peer support specialists in the mental health system, told Mashable in an email that peers seem to be most effective when they meet a minimum set of criteria, receive standardized training, have supportive supervision, and are well-integrated into the overall health system. Emotional support platforms often lack these safeguards and provide minimal training.  

McBain said he doubted that untrained individuals would have the "full set of tools" required to support a client, or user, in the same manner as someone who underwent full peer support specialist certification. While he sees value in empathetic listening, particularly from those with lived mental health experience, he believes emotional support platforms need to be fully transparent about what they are — and what they're not. 

"I am not discounting the possibility that these platforms may prove to be a disruptive innovation over the long-run — but they require regulation, and the government is in a position of playing catch-up," McBain said. 

When talking to someone for free on the internet became a big business 

Though it took time, the isolation of the COVID-19 pandemic, as well as the loneliness epidemic, supercharged the concept of digital peer support as a business proposition.  

Wisdo Health has raised more than $15 million from investors like 23andMe founder Anne Wojcicki and Marius Nacht, an entrepreneur who cofounded the healthtech investment fund aMoon. 

The company describes itself as a "social health" platform, emphasizing that it measures changes in people's perception of their loneliness, among other emotional indicators. Users can access the platform for free, but the majority are sponsored by an employer. 

Circles, a competitor to Wisdo Health, has raised $27 million since its founding. 

Other companies have raised far less money. STIGMA, which folded at the end of 2023, was initially bootstrapped by its founder and CEO Ariana Vargas, a documentary filmmaker. HearMe has raised approximately $2 million. It partners with third parties and offers two subscription tiers: a weekly membership is $7.99, while an annual membership is $69.99.

HeyPeers generates most of its revenue by hosting and staffing video-based support groups for nonprofits. Independent members can join for free. They can participate in HeyPeers support groups, which are facilitated by certified peer support specialists, for $10 per meeting.  

Both Circles and Wisdo Health have pivoted away from a subscription strategy, focusing on landing contracts with major payers like insurers and employers. In March 2023, Wisdo Health partnered with a nonprofit organization in Colorado to make the platform available to adult residents, with a particular emphasis on reaching Medicaid recipients.

In 2018, 7 Cups received a multimillion-dollar contract from the California Mental Health Services Authority to provide the platform to residents in certain counties, but that project was quietly terminated after safety issues, including abusive and sexually explicit behavior, became a concern, according to sources involved in the initiative who spoke to Mashable.

Balancing growth and safety

Rob Morris, CEO of the youth emotional support platform Koko, incorporated it as a nonprofit in 2020, after originally cofounding it as a for-profit company. The shift was motivated partly by Morris' decision not to sell user data or sell the platform to third parties, like employers or universities.

"I think it's hard to find a business model in this space, particularly if you're reaching underserved individuals or young people, that doesn't create misaligned incentives," he said. "We just couldn't find a business model that made sense ethically for us." 

He noted that platforms under pressure to demonstrate high engagement may hesitate to create robust safeguards. 

"The more moderation you put in place, the more constraints you put in place, the less user engagement or attention you get," he said. 

Recruiting users for emotional support platforms often requires a low bar to entry, like free access to services and anonymity. At the same time, these features can create risky or dangerous conditions on the platform. 

Companies may also find ways to derive additional value from the users themselves. Lippin, CEO of HearMe, told Mashable that one of its business deals involves providing its listening service to nurses at a time when burnout is causing a shortage in the profession. 

HearMe aggregates and anonymizes what the nurses share and relays that to their employer, which wants to identify workplace concerns or complaints that might affect their well-being. Lippin said the terms of service indicated to consumers that their data could be used in this way. 

STIGMA, a platform designed for users to receive support when talking about their mental health, tested sponsored content prior to shutting down at the end of 2023. Vargas, the company's founder and CEO, told Mashable that she didn't want to advertise to users, but instead hoped to present users with content "sponsored by the people who want their brands in front of our member base." The founders of Wisdo Health and Circles both told Mashable that they are opposed to advertising.

Many emotional support platforms rely, in some way, on the free labor of volunteer listeners. 7 Cups, in particular, has depended on volunteers to perform critical tasks since its founding. 

"We deliberately designed the platform with a volunteer emphasis from the very beginning, because that appears to be one of the only ways to scale emotional support," Moriarty told Mashable.  

On Wisdo Health, a user can become a "helper," an unpaid community leadership role, after graduating from a training program made available to highly engaged and helpful users. They receive a helper badge only if they pass a training test and continue to demonstrate high levels of helpfulness to others, as assessed by the platform's algorithm. Helpers are expected to check on a certain number of users each day. Roles above helper are filled by paid staff members. 

HearMe uses a combination of paid and volunteer listeners, including graduate students pursuing a social work degree who need the experience to meet their program's requirements. The company vets graduate students, psychology interns, and certified peer specialists against a list of "excluded" individuals maintained by the Office of Inspector General at the Department of Health and Human Services. The list comprises individuals who have violated certain laws, including those prohibiting patient abuse and health care fraud. 

The amount of training volunteer listeners receive varies widely. 7 Cups requires users to complete an "active listening" course in order to become a listener who takes chats. It also hosts numerous other trainings, but they are optional. Circles members who want to become a "guide" and host their own chat room must apply and, once accepted, receive facilitator training. 

In general, volunteering to support and listen is pitched to users as a fulfilling way to give back, perhaps not unlike volunteering for a crisis line. Those organizations, however, are typically nonprofits, not venture-backed startups with an eye toward a potential acquisition.  

Safety challenges on emotional support platforms

Founders of emotional support platforms often share a compelling personal story about why their product is critical at a time when loneliness is surging and mental health is declining. 

Gaon has said that his father's battle with terminal cancer led to the platform's creation. Irad Eichler, founder and CEO of Circles, said that his mother's experience with cancer, and the support he received from friends, prompted him to build a "place for people dealing with any kind of emotional challenge." 

For consumers, the assumption undergirding the concept of an emotional support platform is that people will use access to such a network for good. The reality, however, is far more complicated. 

Eichler is candid about the fact that some people occasionally join the platform with "different motivations, and not with the best intentions," even if the vast majority of interactions are positive or supportive.  

That's why both members and paid staff moderate rooms to make sure discussions are on topic and that conversation is respectful. Eventually, said Eichler, artificial intelligence will police all the rooms on a constant basis and alert the company to bad behavior. Moriarty, of 7 Cups, told Mashable the company was working on deploying a similar solution, including for one-on-one chats.  

Users on both platforms can manually report negative experiences.  

Offenses met with an immediate ban on Circles include violent or inappropriate language, aggressive behavior toward others, noncooperation with group facilitators, and taking over a chatroom despite the protests of other users. 

"It's an ongoing challenge," Eichler said of the risk bad actors present to emotional support platforms. "It's not something that you can solve. There's a tension that you will always need to manage. I don't think we will hit the place where Circles will be a 100-percent safe space." 

Eichler was emphatic that safety was a priority, as were the CEOs of Wisdo Health, HeyPeers, HearMe, and 7 Cups. 

Yet each major emotional support platform also employs anonymity, which can create unique risks.  

On 7 Cups, bad actors and predators have taken advantage of anonymity. Abusive behavior on the platform has included sexual and violent language, such as directing users to kill themselves, according to former and current staff and volunteers who spoke to Mashable. 

On HeyPeers, which allows teens to join, CEO Vincent Caimano told Mashable that, last year, the platform's staff caught a man appearing to flirt with a teen girl in a chatroom about depression. The room, which had been unmoderated overnight, was open for conversation among anonymous users. When the public exchanges were noticed in the morning, Caimano banned the adult user and staff reached out to the teen about the incident. The company has also shut down chat rooms whose hosts failed to moderate them consistently, which means checking in every day and participating in the conversation. In general, HeyPeers conducts background checks on its staff and contractors via the service Checkr.

Gaon defended Wisdo Health's use of anonymity. He told Mashable that the company had encountered past situations in which people didn't feel comfortable sharing information with a listener if it could be traced back to them, and that he wanted the platform to cater to both those who want to publicly identify themselves and those who don't.

"If you don't allow anonymity, you're not giving the user control over how open they want to be with their real name and real profile details," he said. Gaon later added that the vast majority of the platform's users join via a sponsor, like an employer, that requires them to verify their membership and identity to join. The remaining users have joined without that level of vetting. 

Koko enforces anonymity, and it does not allow users to message each other directly, even though they routinely ask for the feature, Morris said.

"If we let people continue chatting and DMing with each other, retention and engagement would shoot up a ton, but it's just not what our aim is," he said. "The risk of these longer conversations, people being paired up, is just one we've never taken on."

Dr. Mishkind, a proponent of both high-quality peer support and technological innovation in mental health care, said that he would be hesitant to use any emotional support platform knowing that encounters could end in abuse, harassment, or predation.

"It's a huge risk to everybody associated with it," he said.  

Why consumers aren't protected from harm  

Despite the reality that consumers have painful or harmful experiences on emotional support platforms, the companies may bear no responsibility when this happens.  

A federal law known as Section 230 of the Communications Decency Act has long shielded online platforms from liability when their customers treat each other poorly. Notable exceptions include copyright violations, illegal activity, sex trafficking, and child abuse that the company knew about and didn't attempt to stop. 

While Congress has raised the prospect of overhauling Section 230, particularly to improve child safety online, digital platforms can continue to invoke it as a defense against liability. 

At Mashable's request, Ari Ezra Waldman, a professor of law at the University of California, Irvine, reviewed the terms of service for the companies Mashable reported on and found very limited grounds for a lawsuit if a user sought recourse after experiencing harm. 

Waldman noted that this is a common reality of the "platform economy." 

He added that the business model of connecting people to strangers for "quasi mental health support" would be less likely to exist in a "world where platforms were more accountable to their users, and to the bad things that happened to their users." 

The Food and Drug Administration and Federal Trade Commission also do not have a clear or obvious role in regulating or enforcing actions against emotional support platforms. 

Attorney Carrie Goldberg believes accountability may be on the horizon. Last year, she sued the chat platform Omegle on behalf of a teenage girl who'd endured years of horrific digital abuse after being paired with a child predator. 

The case moved forward despite Omegle's efforts to shield itself from liability by citing Section 230. The judge found that Omegle could be held responsible for defective and negligent product design. Omegle settled the case, then shut down.

"[T]here's not a culture where investors or founders are necessarily looking at the ways that a product can be abused, because they're going in arrogantly thinking that they're going to be immune from all harms that happen," Goldberg told Mashable. 

When 7 Cups lost its government contract in California, it led to a settlement agreement that prohibited either party from disclosing the settlement's existence and terms, except under specific circumstances, like complying with applicable law. It's unclear whether the same thing could play out in the future with other emotional support platforms that partner with government agencies, should critical issues arise and lead to a terminated contract.

Mishkind said that companies offering a digital solution to mental health care access should be considered part of the system, and treated as such with clear regulation and rigorous independent evaluation, rather than as outsiders not subject to the same rules as other medical entities.

"I don't think we've quite wrapped our arms around that yet," Mishkind said. "There's this kind of protection around them because they are being seen as disruptors, but…we're all now part of the same system."

If you are a child being sexually exploited online, or you know a child who is being sexually exploited online, or you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Categories: IT General, Technology

Teens who talk about their mental health on this app may be taking a big risk

Mashable - Tue, 03/26/2024 - 10:00

On July 16, 2018, a 14-year-old Texas girl sent explicit photos of herself to a 42-year-old man named Anthony Joseph Smith. 

Smith, who lived in Butler, Pennsylvania, met the teen online, posing as a 15-year-old boy, and they began messaging frequently. Eventually, he tried to convince the teen to leave her parents and join him in Pennsylvania.

It's an increasingly familiar story. Online enticement and exploitation can happen on nearly any digital or social media platform. But Smith didn't meet his victim on X/Twitter, Instagram, or Discord, platforms where well-known, documented cases of enticement, abuse, and exploitation have occurred.

Instead, Smith met the teen on a popular emotional support platform called 7 Cups, which encourages people to chat with someone online about their problems and is free. Some users are grappling with serious mental health issues. 

This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.

The Texas teen, whose name wasn't released by Pennsylvania authorities because she was a minor at the time, may have thought she was safe on 7 Cups. Teens as young as 13 can join its dedicated teen community. The company permits adults who've been internally vetted to chat with its teen members (Smith was not a vetted adult). Though 7 Cups recommends that minors receive parental permission before joining, it does not verify that, nor does it verify their age and identity.

As Smith proved, adults can lie about their age to gain access to the community. This remains true today; Mashable attempted to make teen accounts using a fake email address, name, and birth date, and was granted instant access. 

When told that Mashable had easily made a fake account to join the teen community, 7 Cups CEO and founder Glen Moriarty said doing so was against the platform's terms of service. He noted that people can sign up for services online using inaccurate information and that 7 Cups employed certain measures, like blocking, reporting, and language detection tools, to help keep users safe. 

Moriarty said he was not informed by law enforcement or the minor's parents about the case in Pennsylvania. He disputed both that adults prey on youth on 7 Cups and that adult users experience persistent safety issues on the platform. 

"[W]e have a thriving community of people," he said in a written response. "If 7 Cups tolerated this behavior, we would not have a thriving community."

While 7 Cups warns members against going off-site together, it still happens, according to multiple sources with high-level current or past involvement with the platform. 7 Cups does attempt to block personal information, like an email address, when people try to share it while chatting. 
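To illustrate what that kind of blocking typically involves — 7 Cups hasn't published its implementation, so the patterns and the `redact_contact_info` helper below are hypothetical — a minimal sketch of pattern-based filtering might look like this:

```python
import re

# Hypothetical patterns for contact details a chat filter might catch.
# Real systems are more sophisticated and try to handle obfuscation
# (e.g., "jane dot doe at gmail"), which this sketch does not.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\b(?:\+?\d{1,2}[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b")

def redact_contact_info(message: str) -> str:
    """Strip detected emails and phone numbers before a message is delivered."""
    message = EMAIL_RE.sub("[removed]", message)
    return PHONE_RE.sub("[removed]", message)

print(redact_contact_info("email me at jane.doe@example.com or text 555-867-5309"))
# -> email me at [removed] or text [removed]
```

Filters like this are easy for determined users to evade with light obfuscation, which is one reason off-site contact remains difficult to prevent.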

Regardless, Smith eventually lured the teen off-site to other social media and messaging platforms, though he was not successful in his attempts to get her to join him in Pennsylvania. 

"The reality that a young person might go online and seek confidence and support because they don't have it offline, and that relationship being one that is abusive because there is a bad person out there that is targeting kids … that's terrifying," said Melissa Stroebel, vice president of research and insights at Thorn, a nonprofit organization that builds technology to defend children from sexual abuse. 

Emotional support platforms and their inherent risks to minors

Founded in 2013, 7 Cups was one of the first online emotional support platforms. These platforms are typically designed to be spaces where people can anonymously message a "listener" about their worries, stresses, and challenges.

The isolation of the COVID-19 pandemic, as well as the loneliness epidemic, supercharged the concept of digital peer support as a business model. Competitors to 7 Cups like Wisdo Health, Circles, and HearMe argue that their services are a critical tool given the nationwide shortage of mental health professionals and difficulty finding affordable therapy.

Venture capital firms and investors see promise in the model. In the past few years, they've poured more than $40 million into the largely unregulated field of startups, according to news reports and funding announcements made by those companies. 

In 2013, Moriarty successfully pitched the idea for 7 Cups to the famous Silicon Valley startup incubator Y Combinator, which he said still owns 7 percent of the company. Moriarty is also the longtime CEO of the digital learning company Edvance360.

Last year, the Office of the U.S. Surgeon General included Wisdo Health in a list of resources for improving social connection, a clear sign that power brokers take the model seriously. 

But an investigation into 7 Cups, and the emerging market of emotional support platforms, suggests that there are far more risks than the industry and its supporters disclose. These risks have been documented online by concerned people claiming to be users, many of them anonymous, but this reporting comprises the most comprehensive account of 7 Cups available to the public.

  • Mashable interviewed six sources with current and past in-depth knowledge of 7 Cups' practices and safety protocols; reviewed the platform's policies; spoke with listeners and users on other platforms; and discussed safety concerns with CEOs of other emotional support platforms. Mashable also investigated why 7 Cups lost a lucrative contract with a California state agency in 2019, and found that safety issues were a factor.

  • The sources who spoke about their experiences with 7 Cups requested anonymity because they feared violating a nondisclosure agreement the company required them to sign.  

  • Mashable found that several high-level current and former 7 Cups staff have long been concerned about the safety of minors and adults on 7 Cups. 

  • Though the platform employs strategies to keep bad actors and predators at bay, some have found ways to evade security measures. Moriarty told Mashable, "Combating people with bad intentions is an arms race. You have to continuously innovate to stay ahead of them."

  • 7 Cups relies on volunteers to perform critical functions, such as moderating chat rooms and facilitating group support sessions, and teens are permitted to volunteer to work on company projects.

  • Volunteer listeners, who receive some mandatory training, are sometimes exposed to unwanted sexual content as well as offensive or bullying messages. The same behavior sometimes surfaces in public forums; users, for example, have been told to kill themselves by bullies or trolls. In both scenarios, 7 Cups attempts to block such speech before another user reads it by using language detection.  

  • Since platforms like 7 Cups use a peer-to-peer approach, they are not necessarily subject to regulation by the U.S. Food and Drug Administration or enforcement by the Federal Trade Commission. Nor are they required to comply with the Health Insurance Portability and Accountability Act for those services. 

While these risks are prominent on 7 Cups, Mashable's reporting found that the industry has not openly addressed or resolved many of the same concerns. 

"It makes you think there really need to be official systems of checks and balances when you have this degree of harm happening to people," said Dr. John Torous, a psychiatrist and director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston.

The Texas teen's parents discovered her exchanges and alerted law enforcement, who confirmed that Smith had asked for sexually explicit images and received four. Smith's arrest was first reported by the Pittsburgh Tribune-Review in October 2018. Mashable reviewed Smith's publicly available court records and confirmed the case's details with Robert M. Zanella, Jr., the Butler County assistant district attorney who prosecuted Smith. 

In April 2019, Smith pleaded guilty to one count of corrupting a minor and four counts of coercing a child into creating child sex abuse material. He returned to jail last year after violating his parole by sharing fantasies about an adult woman's young daughter on Facebook Messenger, according to Zanella. The woman reported those exchanges to the Federal Bureau of Investigation. 

Smith's case might be characterized by some as one more instance of a predator weaponizing digital technology to suit their own nefarious aims. But emotional support startups are distinct from other types of technology companies, like gaming and social media platforms, because they specifically invite vulnerable people to seek support from strangers, who may have a range of motivations and intentions. Smith's crimes reveal how unpredictably risky these interactions can be. 

7 Cups of Tea: Talking to people online for free 

The idea for 7 Cups of Tea, as it was originally called, started at psychologist and founder Glen Moriarty's kitchen table, according to 7 Cups for the Searching Soul, a self-published book he co-authored in 2016.   

Moriarty turned to his wife, whom he has described as a therapist, for guidance with a business problem and was grateful for her "close listening." The exchange was a revelation for Moriarty. 

"Her care helped me see the problem in a different light so that I could solve it. It was at this point that the clouds parted, the sun shone through, and I had the insight I had been waiting on," he wrote in 7 Cups for the Searching Soul. "What if, any time you needed it, you could access a person who would listen to you and care about your problem?" 

Moriarty was the first listener on the platform, his wife the second. From the beginning, he struggled to find people to provide the service he was advertising. "I could never get enough listeners," Moriarty told Twitch cofounder Justin Kan in a 2020 podcast interview.

The company has always been reliant on volunteers to operate. 

"We deliberately designed the platform with a volunteer emphasis from the very beginning because that appears to be one of the only ways to scale emotional support," Moriarty told Mashable.  

7 Cups relies on unpaid volunteers with little training to fill critical roles

Volunteer listeners on 7 Cups are not held to independent, standardized guidelines, like the National Practice Guidelines for Peer Supporters, though they are required to complete an "active listening" course upon volunteering to listen. They can take additional courses produced by 7 Cups, as well as consult with volunteer mentors identified by staff as having demonstrated strong leadership skills. 

Moriarty described the company's staff as "incredibly lean." Among the platform's listeners, 1,500 have what Moriarty describes as "leadership roles." This means that they take chats from members seeking support as well as volunteer their time on tasks like providing guidance to other listeners, sometimes helping them to process difficult chats, and monitoring forum posts for content that needs to be reviewed by staff. 

Sources who've worked and volunteered for 7 Cups said that dozens of volunteers lead major projects and perform key tasks, including evaluating user safety reports and complaints that are generated by automated safety tools. There is no publicly designated head of trust and safety known to the platform's users. Moriarty told Mashable that "trust and safety is not something we have one person do, but is rather distributed across the team."

Sources familiar with the recruitment of volunteers and the daily tasks involved in unpaid roles say there is little required training but high expectations. 

"You get no money, you get no protection, you get nothing," said one former longtime volunteer, who requested anonymity to discuss their experiences. "They make it pretty clear that they want as much from you as possible, as long as possible." 

Those who've volunteered for the platform told Mashable they believe in its stated purpose and have derived great satisfaction from extending compassion to someone in need. Moriarty said notes from users, including comments posted in forums, emphasize how much the service has helped them, and even "saved" them. 

For the new 7 Cups user, the promise of healing connection is powerful. But the reality of what happens on the platform is far more complicated.  

Anonymity can compromise teen safety on 7 Cups

Moriarty has championed anonymity as a tool for building trust between users, and this is a common practice on competing emotional support platforms. Ideally, anonymous personas enable people to freely support one another without worrying that the information shared could be used against them publicly. Unless users share their real identity, no one really knows to whom they're talking.  

But anonymity can backfire, too. On 7 Cups, the failure to verify teens' identities is what allowed Smith to go undetected as an adult predator. 

Several of the sources who spoke to Mashable said they were frustrated and distressed over the platform's teen safety issues. Two sources with listening, volunteer, and work experience at the company showed Mashable screenshots of exchanges between the platform's users in an effort to substantiate claims that adult listeners had preyed on teen members, and that teens were aware of and concerned about such behavior. Because the platform is anonymous, Mashable couldn't verify the details of these accounts firsthand with alleged victims.

Four other sources with similar knowledge of 7 Cups said they'd known about concerns related to teen safety. 

Moriarty described the claim of concern over predatory behavior toward teens as "inaccurate." He said the company has only received and complied with 10 law enforcement requests since its founding, and argued that the number was low compared to other social platforms. 

Experts in online child exploitation, however, say that the number of cases investigated by law enforcement may be dwarfed by the actual incidence of predatory behavior, partly because minors may not feel comfortable reporting it.   

Additionally, some predators online seek out emotionally vulnerable minors who they believe they can manipulate into creating child sexual abuse material or other types of traumatic content. An FBI warning issued in September 2023 identified one such group of predators, which is known to target youth between the ages of 8 and 17 who struggle with mental health issues. There is no evidence that the group has infiltrated 7 Cups' teen community.   

Compared to its competitors, 7 Cups is unique in how aggressively it welcomes minors. In a 2018 presentation to California mental health officials, Moriarty said 18- to 25-year-olds were the platform's largest demographic, followed by younger teens. 

Teens must be 13 to join as a member and 15 to volunteer as a listener. When teens seek to chat with a listener, they are either randomly paired with someone and cannot choose between a teen or adult-teen listener, or they can browse the listener directory and make a request of a user. Listener profiles indicate whether a listener chats only with teens, or with both teens and adults; the latter means they are an adult who has been vetted by 7 Cups.

For teens who make a general request, not a personal one via the directory, and are paired with an adult-teen listener, the listener's username should be followed by a note indicating that the person is an adult, Moriarty said. When Mashable tested the teen chat function, that information was missing for the adult-teen listener, which Moriarty said was a bug that would be quickly fixed. A teen can also determine whether their listener is an adult by hovering over their icon or by clicking out of the chat — which they can then return to — to view the listener's bio page, which may or may not include a specific age.

Upon turning 18, minors can join or age into the adult side of the platform, though some sign up for it anyway before that milestone by creating an adult account with a false birth date, according to those with knowledge of related incidents. 

"In some ways, the easiest thing in the world for 7 Cups to have done at any point would've been just to say, 'Let's not do teens,'" said one source who previously worked at the company and who noted that efforts to connect teens to meaningful emotional support were genuine. 

"Clearly if a 42-year-old can pose as a 15-year-old, you're not vetting the identities of the teens well enough," the individual said. 

Research conducted by Thorn indicates that anonymity can contribute to increased risk-taking. An anonymous persona may embolden youth to interact with others online in ways they otherwise wouldn't.

For predators hoping to abuse adolescents and teens, that can create opportunities to isolate, victimize, and "build false relationships" with young users, according to a 2022 Thorn report on online grooming, which surveyed 1,200 children and teens between the ages of 9 and 17. 

One in seven respondents said they've told a virtual contact something they'd never shared with anyone before, a possibility that is far more likely on an emotional support platform like 7 Cups, which invites youth to be vulnerable with strangers. 

"Sadly, bad actors target this same information to groom, exploit, and extort minors," the Thorn report noted.

Recently, a member of 7 Cups' teen community asked leadership to raise awareness of predatory behavior on the platform and explain what to do when members encounter it, a sentiment that was echoed in a group support chat room. Moriarty said a community manager referred members to 7 Cups' safety information and its biweekly internet safety discussions.

The widespread use of volunteers on 7 Cups has also presented distinct safety challenges for teens. 

Some 7 Cups sources said they heard directly from teen volunteers that they felt unsafe while communicating with adult volunteers, which Moriarty said he had no way to substantiate. They noted that while users are instructed not to go off-site together under any circumstances, volunteers correspond via Google Chat and Meet without dedicated oversight by paid staff. Moriarty confirmed to Mashable that volunteer leaders may use Google communication tools to "collaborate" with other volunteer leaders. 

Based on past incidents, current and past staff and volunteers remain concerned that teens may be targeted for exploitation or grooming in those circumstances. 

Safety protocols don't go far enough

In general, Moriarty said 7 Cups has safety protocols designed to keep anonymous bad actors and predators from contacting minors, but multiple past and current staff members and volunteers told Mashable that they fear those practices aren't robust enough. 

The platform has 87 adult-teen listeners, most of whom are on staff or are high-level volunteers. Only 12 of those listeners have no other affiliation with 7 Cups.

In order to gain access to the teen community as an adult without lying about age, listeners need to have extensive experience on the platform, good reviews, and what 7 Cups refers to as a background check. 

That process involves submitting a state-issued identification to the company, as well as a video conversation with a platform moderator. Additionally, 7 Cups staff search the internet for press coverage of the applicant's name in association with criminal acts, such as sexual assault, and may check to see if their name is in a national database of sex offenders. 

Moriarty said that all applicants must pass a background check by companies that specialize in such research, but those familiar with the process say that hasn't always been the case. Instead, they said that the company previously used free resources like Google and social media to check applicants' personal information.  

Currently, 7 Cups doesn't have a rigorous standard, such as algorithmic assessment technology, for verifying that identification is real rather than doctored or forged. Moriarty said the company is exploring the use of sophisticated identity document verification.  

Nor does the company have clear directives for handling complaints about potentially criminal behavior toward minors on the platform, aside from instructing staff and users to report concerns through its safety reporting form. A pinned message at the top of each chat instructs users who feel unsafe to visit the platform's "safety & reporting" page, which recommends using blocking, reporting, and muting tools. A brief section on teen safety urges minors to talk to a parent or guardian if they feel unsafe. 

One source with knowledge of the platform's current practices told Mashable that there wasn't widespread staff training on whether and how to escalate such reports to law enforcement. When Mashable asked whether 7 Cups informs a minor's parents when an adult has tried to contact their child, Moriarty called it a good idea and said the platform would be implementing the protocol shortly.

A former high-level 7 Cups volunteer, who also served as an adult-teen listener, said that multiple teen members of the platform approached them with questions about how to deal with uncomfortable interactions with adult listeners. Often, the teen felt something was amiss with the adult's behavior, but they struggled to pinpoint a specific red flag or offense.  

"When you have somebody that you think is empathizing with you and listening to you and finally getting you…you're forming this intense bond and then they say things like, who knows what, you don't want to disappoint them, or break that bond, or lose that relationship, and then somebody pounces," the former volunteer told Mashable. 

Until Mashable contacted Moriarty for comment, the platform hadn't updated its webpage on teen safety since May 2019. He said the company was also reviewing where and how it presented information about reporting unwanted or abusive behavior to make those instructions more accessible.

Safety practices vary widely from platform to platform on the internet, said Lauren Coffren, an executive director of the Exploited Children Division at the National Center for Missing & Exploited Children. That makes it hard for minors, and their caregivers, to understand which policies keep them safest. It may also be an advantage for predators. 

"People who want to be able to exploit those differences or exploit [that] lapse of reporting mechanisms or safety features or tools, they'll certainly be able to find a way," Coffren added. 

What happens when someone is harmed on 7 Cups 

Simply put, there are no dedicated federal agencies that regulate platforms like 7 Cups.

The company's emotional support product falls in a gray regulatory area. And while Moriarty described the platform's peer-based interventions as "medicine" in his interview with Justin Kan, these interactions are not offered by licensed clinicians, nor held to rigorous independent testing or standards. 

Neither the Food and Drug Administration nor the Federal Trade Commission would comment specifically on 7 Cups itself. Instead, both agencies pointed Mashable to their regulatory guidelines. The FDA may regulate mobile apps whose software is intended to treat a condition, but that doesn't apply to emotional support. The FTC could potentially enforce laws related to health claims and marketing practices, if they were allegedly deceptive. 

This may leave consumers wondering where to turn if they or their child is harmed on the platform.

Until recently, the law didn't offer much hope, either. Traditionally, 7 Cups might have been considered immune from liability for harm inflicted on its users when they encountered a bad actor on the platform. In the past, courts typically dismissed such lawsuits, citing a federal law known as Section 230 of the Communications Decency Act, passed in 1996. 

The law provides that online platforms cannot be held liable for the negative things that their customers or users do just because they occur on the platform. There are some exceptions, including copyright infringement, illegal actions, child abuse, and sex trafficking. Section 230 protection hinges on whether the company is being sued solely in its role as a publisher of other people's content. Some courts have interpreted this broadly to give platforms immunity from liability when the company's customers experience harm based on the platform's content.

But Section 230, as tech companies have come to know and rely on it for nearly 30 years, may be changing. In a Senate hearing on online child sexual exploitation in January, which featured top tech company executives, key senators called for the law's reform. 

Courts have also allowed recent lawsuits against certain platforms to move forward, rejecting some of the companies' claims to immunity under Section 230. 

One key case is a nationwide lawsuit against major social media companies, including YouTube, TikTok, and Instagram, filed on behalf of young users who were allegedly harmed as a result of using the platforms. Last November, a judge ruled that critical aspects of the suit could move forward, despite the companies' insistence that they were protected by Section 230.

Instead, the judge found that the plaintiffs had alleged the platforms' product design choices led to harm that had nothing to do with the content that users published. For example, the judge ruled that the failure to implement robust verification processes to determine a user's age, effective parental controls and notifications, and "opt in" protective limits on the duration and frequency of use are product design defects for which the companies could potentially be held responsible.

Jennifer Scullion of Seeger Weiss, a firm representing the plaintiffs, told Mashable in an email that all companies "have a responsibility to run their businesses in a way that avoids foreseeable harm."

Scullion said that while emotional support platforms involve a different set of facts and analysis than the case against major social media companies, "the real dividing line is whether the harm is from the content itself or from choices a company makes about how to design their platform and what warnings they give of reasonably foreseeable or known risks of using the platform."

The lawsuit that forced the chat platform Omegle to shut down last year may also hold lessons for 7 Cups. In that case, attorney Carrie Goldberg sued the company on behalf of a teenage girl who, at age 11, had been paired to chat by Omegle with a child sexual abuse predator. He spent the next three years exploiting her, forcing her to make child sexual abuse material for him and others. 

That case also moved forward despite Omegle's attempts to shield itself from liability by citing Section 230. The judge found Omegle could be held responsible for defective and negligent product design. Omegle settled the suit. 

Goldberg, who hadn't heard of 7 Cups prior to speaking with Mashable, said attempting to sue the company for harm experienced by users, particularly those who are minors, would depend on whether their distress was caused by content published by other users on the platform or by the design of the product itself. 

Goldberg expressed concern about 7 Cups' ability to match vulnerable people, including children, with bad actors, noting that such information could easily be used to manipulate or exploit them.

"It's a product that's grooming people to be revealing very intimate details of their life," she said.

If you are a child being sexually exploited online, or you know a child who is being sexually exploited online, or you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Categories: IT General, Technology

Why online child exploitation is so hard to fight

Mashable - Tue, 03/26/2024 - 10:00

As the recent Congressional hearing on online child sexual exploitation demonstrated, the manipulation and abuse perpetrated by bad actors against vulnerable teens on social and digital media platforms can be devastating. 

Consider a few high-profile cases:  

A 54-year-old man reportedly targeted a 14-year-old girl on Instagram in December 2022, plying her with a gift card after she remarked in her own post that clothing was expensive. The man allegedly drugged and raped the teen multiple times after cultivating an in-person relationship with her, according to Manhattan District Attorney Alvin Bragg. 

A 13-year-old boy from Utah was abducted by an adult male in late 2022, after the man groomed the teen on social media platforms, including Twitter (now branded as X), the teen and his parents reported. The boy returned home after five days, but prosecutors said he'd been repeatedly sexually assaulted. 

Mashable's own investigation into emotional support platforms recently found concerns about teen safety on 7 Cups, a popular app and website where users can seek and offer compassionate listening. In a 2018 case originally reported by the Pittsburgh Tribune-Review, a 42-year-old Butler, Pennsylvania, man lied about his age to gain access to the teen community on 7 Cups. The man posed as a 15-year-old boy and coerced a 14-year-old girl into sending him sexually explicit imagery of herself, crimes to which he pleaded guilty.  

These cases reflect a chilling reality. Predators know how to weaponize social media platforms against youth. While this isn't new, it's an increasingly urgent problem. Screen time surged during the COVID-19 pandemic. Adolescents and teens are in the midst of a mental health crisis, which may prompt them to seek related information on social media and confide in strangers they meet there, too. Some research also suggests that youth are increasingly comfortable with conducting an online romantic relationship with an adult. 

This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.

Bad actors and predators appear to be capitalizing on these trends. Data collected from the Exploited Children Division at the National Center for Missing & Exploited Children (NCMEC) show an alarming increase in online enticement, or types of predatory behavior designed to exploit a minor. 

While there's no single reason that explains the heightened risk, the largely unrestricted access adult bad actors have had to youth online, in the absence of robust safety measures and meaningful federal regulation, may have both emboldened predators and influenced youth attitudes about the adult behavior they'll encounter online.  

Though youth and their caregivers may think the risk of online exploitation is low, the design of many social media platforms tends to maximize opportunities for predators while leaving youth to fend for their own safety. Online child safety experts argue that platforms should take far more responsibility for ensuring their security, and urge youth to report any abuse or exploitation to a trusted adult or to the authorities.

"I think sometimes the pressure is on for these kids to figure it out for themselves," says Lauren Coffren, an executive director of the Exploited Children Division at NCMEC. 

"This is happening on every platform" 

It doesn't matter where youth spend their time online — a popular platform for adults or a space specifically created for teens — bad actors are targeting them, says Melissa Stroebel, vice president of research and insights at Thorn, a nonprofit organization that builds technology to defend children from sexual abuse. 

"At the end of the day, this is happening on every platform," she notes. 

Popular social media platforms don't effectively verify user age, making it possible for children younger than 18 to sign up for services that may put them at greater risk of coming into contact with adults who intend to exploit them. Similarly, adults can often access gated teen communities by simply lying about their age. 

Safety features, like blocking and reporting, can be hard to access, or never explicitly introduced to minors as a way to protect themselves. Bad actors can evade platform bans by creating new accounts using burner email addresses or phones, because their profile isn't tied to a verified identity.

Protecting oneself from online exploitation as a teen, or safeguarding a child as an adult, can be extraordinarily hard under these circumstances. 

Data collected by NCMEC suggests the problem is worsening. Between 2022 and 2023, NCMEC logged a 132-percent increase in reports related to online enticement. This increase included an emerging trend in which children are financially blackmailed by users who request and receive nude or sexual images of them.

Common tactics that predators use to entice children include lying about their age to appear younger, complimenting a child or connecting with them over mutual interests, engaging in sexual chat, providing an incentive like a gift card or alcohol, offering or sending sexually explicit images of themselves, and asking a child for such material. 

Victimization is never a child's fault, experts say. Nor should youth be expected to constantly guard against the threat of exploitation. Instead, prevention experts say that minors and their caregivers need better tools to manage risk, and that social media companies need to design platforms with youth safety as a key priority. 

Unsafe by design 

Thorn urges platforms to consider child safety from the outset. One best practice is to pair content moderation features with human trust and safety staff who know what exploitation looks like, so the platform can recognize abusive interactions and material and report them internally and to the authorities. 

But that's not enough. Stroebel adds that platforms must also have the capacity to scale those systems as the user base grows. Too often, the systems are implemented well after a product's launch and aren't designed to scale successfully. 

"We end up trying to put Scotch tape over cracks in the dam," says Stroebel. 

Stroebel says it's imperative that there are tools to recognize, report, and remove someone with predatory intent or behavior. 

On an emotional support platform like 7 Cups, which relies heavily on a volunteer labor force, a safety report might be evaluated by a volunteer who receives little training for making a decision about escalating bad behavior to paid staff. 

Other apps may use a combination of artificial intelligence and paid human moderation to review safety reports and still issue confusing decisions, like concluding that a clearly harmful offense doesn't violate their terms of service. Instagram users have anecdotally found it difficult to get the platform to take action against bullying accounts, for example. 

Coffren says the NCMEC CyberTipline, which receives reports of child exploitation, often hears from youth and caregivers that the platform reporting process is more difficult than they expected. Multiple links or clicks take users to different subpages, where they might encounter non-trauma-informed language that's inappropriate for someone who's been exploited online. Sometimes people never hear back from the platform once they've made a report. 

Platforms should reduce the "friction" of using reporting tools, says Coffren. They could even require minors to complete a tutorial about how safety tools work before accessing the platform, she adds. 

Coffren points out that every company gets to make its own decisions about safety practices, which creates a "giant disparity" from platform to platform and makes it difficult for youth and caregivers to know how to reliably protect themselves or their children. 

There is legislation aimed at better protecting youth online. Proposed federal legislation known as the Kids Online Safety Act does not impose age and identity verification but would require online platforms to enable the strongest privacy settings for underage users. It would also mandate a "duty of care" so that social media companies have to prevent and mitigate harms associated with using their product, including suicide, eating disorders, and sexual exploitation. 

The legislation has many backers, including the American Psychological Association and the American Academy of Pediatrics. Yet critics of the bill say it would curtail free speech and discourage marginalized youth, such as LGBTQ+ minors, from learning more about their identity and connecting with other queer and transgender community members online. 

Youth more vulnerable online than adults realize

Youth view online experiences differently than many adults, which is why it's critical to incorporate their overlooked perspectives in policy and design choices, says Stroebel. 

Research on online grooming conducted by Thorn found that sharing nudes is now viewed as normal by a third of teens and that half of minors who'd shared such images did so with someone they only knew online. Slightly more than a third of those respondents said they'd given nudes to someone they believed to be an adult. 

Stroebel says the "stranger danger" catchphrase that Gen X and older millennial parents grew up hearing isn't sufficient as standalone advice for avoiding risky situations online. Instead, youth are accustomed to creating a digital social network comprising friends and acquaintances that they've never met before. For some of them, a stranger is just someone who's not yet their friend, particularly if that unknown contact is a friend of a friend.

On its own, this isn't necessarily risky. But Thorn's research on grooming indicates that youth can be surprisingly open with online-only contacts. One in seven respondents said they've told a virtual contact something they'd never shared with anyone before, according to a 2022 Thorn survey of 1,200 children and teens between the ages of 9 and 17. 

Worryingly, the norms around online romantic interactions and relationships, particularly with adults, appear to have shifted for youth, potentially making them more vulnerable to predation. 

The survey found that a significant proportion of youth thought it was common for kids their age to flirt with adults they'd met online. A quarter of teens believed it was common to flirt with users ages 30 and older. Among 9- to 12-year-olds, one in five felt the same way about romantic interactions with older adults. 

Stroebel says that youth struggle when responding to adult behavior that seems predatory. Many view reporting as more punitive than blocking, which creates an "immediate barrier of defense" but doesn't trigger a platform protocol that ends in confronting or banning the adult user. 

Stroebel says that manipulation weighs heavily on a young person's decision when, for instance, the adult insists the teen misunderstood a comment. 

"Think about how hard it is to recognize manipulation in a way that you trust your gut," says Stroebel, adding that a young user may have confided in the adult or feel understood in a way they've never experienced before. Expecting youth to recognize a manipulative dynamic is an unreasonable burden, says Stroebel. 

Even when a minor takes action, Thorn's research shows that one in two youth who block or report someone say they were recontacted by the user, either on a different platform or from a new account created with another email address. In half of such cases, the minor experiences continued harassment. Stroebel says that ban evasion is "far too common." 

How to handle online exploitation 

Coffren says that youth who've been exploited online should tell a trusted parent, adult, or friend. The minor or someone close to them can make a report to the CyberTipline, which assesses the information and shares it with the appropriate authorities for further investigation. (The center's 24-hour hotline is 1-800-THE-LOST.) 

Coffren emphasizes that minors who've been exploited have been tricked or coerced and should not be treated by law enforcement as if they have violated the law.

She also wants youth to know that nudes can be removed from the internet. NCMEC's Take It Down program is a free service that lets people anonymously request the removal of nude or sexually explicit photos and videos taken of them before age 18 by adding a digital fingerprint, or hash, as a way of flagging that content. NCMEC shares a list of fingerprints with online platforms, including Facebook, TikTok, and OnlyFans. In turn, the platforms can use the list to detect and remove the images or videos. 
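To make the mechanism concrete, here is a minimal sketch of the platform-side check, assuming a simplified exact-match fingerprint. In practice, systems like this rely on perceptual hashes (such as Microsoft's PhotoDNA or Meta's PDQ) so that re-encoded or lightly altered copies still match; the names below are illustrative, not NCMEC's actual API.

```python
import hashlib

# Illustrative stand-in for the list of fingerprints NCMEC shares
# with participating platforms (this entry is a placeholder).
FLAGGED_FINGERPRINTS = {
    "3a1f...placeholder...",
}

def fingerprint(file_bytes: bytes) -> str:
    # Exact cryptographic hash for simplicity; a perceptual hash
    # would also catch near-duplicate images and videos.
    return hashlib.sha256(file_bytes).hexdigest()

def should_remove(file_bytes: bytes) -> bool:
    # True if an upload matches a flagged fingerprint.
    return fingerprint(file_bytes) in FLAGGED_FINGERPRINTS
```

A notable privacy property of this approach is that only the fingerprint needs to leave a person's device; the underlying image or video itself does not have to be uploaded anywhere to be flagged.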

Coffren urges youth who've been exploited to stay hopeful about their future: "There is a life after your nude images circulate online." 

But reducing the stigma of exploitation also requires the public to confront how the digital ecosystems youth participate in aren't designed for their safety and instead expose them to bad actors eager to manipulate and deceive them. 

"We have to accept that children are going to weigh the pros and the cons and maybe make the wrong decision," says Coffren, "but when it's on the internet, the grace isn't given to make mistakes as lightly or as easily."

If you are a child being sexually exploited online, or you know a child who is being sexually exploited online, or you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Categories: IT General, Technology

California paid millions to access a mental health app. It wasn't safe for users.

Mashable - Tue, 03/26/2024 - 10:00

On a late spring day in 2019, Mimi Martinez McKay, then deputy director for the Los Angeles County Department of Mental Health, saw that a user on Twitter, the platform now known as X, had been tagging the county in posts that made alarming claims about the safety of 7 Cups, a popular emotional support platform. 

7 Cups, which operates as both a website and app, invites teens and adults to talk to someone online for free. Users give and receive emotional support, but they are discouraged from acting like a therapist. 

The year prior, the county had agreed to provide 7 Cups access to residents as part of a five-year, $101 million initiative known as Tech Suite that was designed to use innovative technology to connect California residents to mental health help. Some of these adults, who were clients of local departments of mental health, had complex behavioral health needs and were highly vulnerable. They lived with serious mental illness and may have experienced homelessness, substance abuse, and domestic violence. 

The Twitter/X user making claims about 7 Cups belonged to the platform's teen community and went by a pseudonym on both platforms. The user claimed that a teen friend, also on the platform, had been manipulated into sharing child sexual abuse material with an adult user outside of the United States. McKay, who said she controlled the department's Twitter/X account, was appalled by the claim. 

This story is part of our investigation into the emotional support platform 7 Cups and the growing marketplace for apps and platforms that pair people with someone who is supposed to be a compassionate listener. The series explores a failed experiment between the state of California and 7 Cups, as well as the myriad risks of seeking emotional support online from strangers. These dangers can include the manipulation of vulnerable youth and targeted abuse and harassment. The series also includes an analysis of why it's so hard to stop online child exploitation, and looks at solutions to make platforms safer.

What happened next was never publicly detailed by the California officials in charge of the initiative, despite multiple public evaluation reports written about it. The incident in L.A. County factored into the abrupt termination of 7 Cups' contract with the state several months later, according to sources with knowledge of the events. By the end of its contract, 7 Cups had received an estimated $6.7 million.

The California Mental Health Services Authority (CalMHSA), which contracted with 7 Cups to provide some state residents with access to the platform, told Mashable in a statement that, "After the initial phase, it was determined that the services, tools, and processes used by 7 Cups did not meet the needs of the target populations for the project."

In fact, CalMHSA ultimately reached a settlement with 7 Cups after its CEO, Glen Moriarty, disputed the contract's termination. That settlement, which Mashable obtained through the Public Records Act, contained a confidentiality clause prohibiting either party from disclosing the settlement's existence, terms, and provisions. The clause had a few exceptions, including if disclosure was necessary to comply with applicable law, rule, regulation, or policy of a governmental agency, like the Public Records Act.

As part of the settlement, 7 Cups paid CalMHSA $460,382 to satisfy its "financial obligation" to the agency, and CalMHSA paid the company $309,277 as "consideration."

Though the settlement agreement did not identify the reasons why 7 Cups' contract had been terminated, an investigation conducted by Mashable found significant concerns regarding trolling and inappropriate comments, including unwelcome sexual and explicit language, as well as a nontherapeutic discussion of child sex abuse on the platform. 

In one unrelated July 2018 criminal case that Mashable reviewed, a 42-year-old Butler, Pennsylvania, man named Anthony Joseph Smith posed as a 15-year-old boy to gain access to 7 Cups' teen community. He met a 14-year-old user and coerced her into sending him child sexual abuse material of herself. Her parents learned of their communication and alerted the authorities. Smith was jailed in 2019, but California officials awarded 7 Cups its contract prior to the criminal acts and appeared to have no knowledge of the case. 

Still, the University of California, Irvine, was paid by CalMHSA to evaluate Tech Suite. Its public reports make little mention of safety issues related to 7 Cups, and do not explain why the company lost its contract. Dr. Dara Sorkin, a principal investigator and professor in the school of medicine at UC Irvine, declined to review fact-checking sent by Mashable or answer questions related to Tech Suite. CalMHSA declined to comment on whether it prohibited the UC Irvine researchers from including detailed safety concerns in its public reports, though the settlement's confidentiality agreement may have effectively barred the researchers from addressing them.

David Loy, legal director of the California-based First Amendment Coalition, reviewed the settlement for Mashable. Loy, who is an open government litigator, noted that CalMHSA didn't violate the law by including a confidentiality agreement in the settlement, because it did not prevent the agency from disclosing a copy in response to a public records request. 

However, the clause impaired the public's ability to know what transpired and hold the agency accountable by effectively forcing the public to request a document that it didn't know existed in the first place, according to Loy. The clause also included stipulations that CalMHSA had to notify 7 Cups prior to disclosing the settlement, which could allow 7 Cups to sue to stop the document's release.

"I think it's a very bad thing," Loy said of the confidentiality clause. "I don't think government agencies should ever be allowed to do this."

Dr. Matt Mishkind, a researcher who studies technological innovation in behavioral health as deputy director of the University of Colorado's Helen and Arthur E. Johnson Depression Center, said the failure to disclose issues or negative outcomes in a project like California's may lead to further user harm, if consumers are never informed of the possible risks of using a platform. Mishkind was not involved in Tech Suite or familiar with it prior to speaking to Mashable. 

"When public dollars are used, part of what we learn should be in the public good..." - Mimi Martinez McKay

The lack of transparency also denied academia and industry the opportunity to learn from critical mistakes in the rapidly growing market of mental health apps. In recent years, a number of emotional support platforms have launched, including Wisdo Health, HearMe, and Circles. 

"When public dollars are used, part of what we learn should be in the public good, and when it's not, then why isn't it?" McKay told Mashable. 

As venture capital funds and investors pour tens of millions of dollars into products with the noble intention of offering free or affordable emotional support to vulnerable people, and employers and insurers look to bring them into their portfolio of behavioral health offerings, California's experiment with 7 Cups prompts serious questions about user safety.

To reconstruct what happened, Mashable reviewed meeting agendas and minutes routinely made publicly available by the Mental Health Services Oversight & Accountability Commission, a body that reviews billions of dollars in mental health spending; quarterly and annual evaluation reports of the project; and government correspondence, invoices, and internal documents and meeting minutes related to the initiative, obtained through public records requests submitted in 2019 and 2020 by Cal Voices, a nonprofit advocacy organization. Mashable also interviewed several individuals familiar with the project's execution, who requested anonymity because they weren't authorized to discuss their experiences or had signed a nondisclosure agreement with 7 Cups. The documents that Cal Voices received in response to its public records requests did not include the settlement, according to the group's executive director, Susan Gallagher.

These events, which have not been previously reported in detail, offer yet another cautionary tale about the risks of leveraging unregulated digital technology to solve some of the thorniest problems in the mental health delivery system.

"A great opportunity" 

The Innovation Technology Suite Project (Tech Suite) initially featured two apps: 7 Cups and Mindstrong, a now-defunct platform that aimed to use artificial intelligence and digital biomarkers to predict the onset of mental illness. 

Both had a persuasive and powerful ambassador in Dr. Thomas Insel, who directed the National Institute of Mental Health (NIMH) from 2002 until 2015. He left the NIMH and joined Verily, an Alphabet precision health company. Insel then served as Mindstrong's president, and as an advisory board member for 7 Cups, when both companies were awarded their contracts, according to a 2018 report published by the state.

In February 2017, Insel, then at Verily and a 7 Cups advisory board member, gave a presentation that included information about the emotional support company at a public meeting of the Mental Health Services Oversight & Accountability Commission, which reviews certain streams of taxpayer funding earmarked for mental health interventions, including for innovation projects. Insel told Mashable that the commission invited him to make a presentation about innovation, and that he was never part of 7 Cups outside of agreeing to be an advisor. At the time, Insel's daughter was employed by 7 Cups as its director of clinical initiatives.

A few months later, members of the commission participated in a daylong meeting at Google-Verily headquarters to explore innovation in the mental health sector, according to three stakeholders who attended or knew of the event. 

Unlike a typical commission meeting wherein stakeholders have an opportunity to review materials in advance and comment on specific mental health programs or proposals, this invite-only gathering was focused on broader discussions about innovation. One participant recalled breaking into small groups of "innovation incubators" in order to create prototypes of apps they might use. 

In October 2017, the Los Angeles County Department of Mental Health presented the commission with a proposal for a multi-county innovation project to "work with one or more technology companies with experience with virtual mental health care platforms." Among other goals, the project aimed to improve access to mental health services and boost social connectedness. The proposal explained that planning for the initiative began in June, after the commission's gathering at Google-Verily.

The state quickly put out a request for proposals. By spring 2018, both 7 Cups and Mindstrong were selected as the only two vendors for a three-year, $35 million initiative. (Tech Suite was eventually renamed Help@Hand, and its budget increased to $101 million as more counties joined.)

Financing for the ambitious effort came from a so-called millionaire tax levied on high earners. Approved by voters in 2004, Proposition 63 was designed to generate revenue for mental health prevention efforts and treatment. The funding allocated through Prop. 63 varies from year to year, depending on state revenue. It's estimated to raise $3.4 billion in the 2023-24 fiscal year. Historically, counties had been required to spend 5 percent of the revenue on innovative approaches to mental health, like Tech Suite. If, after a period of time, counties did not use their innovation funding, it would go back to state coffers for distribution to other counties.

In March, Californians passed a proposition designed to overhaul how Prop. 63 dollars are spent. There will no longer be a mandate for innovation, and counties will be required to spend two-thirds of the tax revenue on housing and round-the-clock services for people who are unhoused and experiencing mental illness.

When 7 Cups received its Tech Suite contract, the company's founder and CEO, Glen Moriarty, celebrated the news in a platform forum on June 18, 2018. He shared that the platform would be focused on supporting foster youth, new mothers, high school students, members of the Deaf and hard of hearing community, and those already receiving mental health care. 

"What I love about this is that, not only will we be able to reach more people in California, this partnership will bring extra resources for 7 Cups," he wrote. "It will give us a great opportunity to improve the 7 Cups website and app, fix bugs more quickly and hire new team members. Well [sic] be able to provide better all around care to everyone around the world." 

Shocking safety issues on 7 Cups

Before 7 Cups was terminated from the project, it received an estimated $6.7 million over 13 months to help Californians in populous and rural counties, including Los Angeles, Orange, Kern, and Modoc, connect with someone who was supposed to be a compassionate listener, according to Cal Voices' analysis of financial data from the documents obtained via a public records request. Ideally, the service would help people, particularly rural residents, feel less isolated, and improve their mental health and emotional well-being.  

Yet what occurred as the project unfolded was far more complex, and troubling. 

Prior to viewing the tweets that made concerning claims about 7 Cups' safety, Mimi Martinez McKay, of the L.A. County Department of Mental Health, had already developed her own reservations about the platform after testing it earlier in the year.

"In my own experience as a user of 7 Cups, I did not feel that the people were professional," McKay told Mashable.

"In my own experience as a user of 7 Cups, I did not feel that the people were professional." - Mimi Martinez McKay

She also worried that L.A. County residents wouldn't be able to easily access a list of local resources where they could receive more formal help, including therapy. 

McKay reviewed the tweets, which included allegations that an adult 7 Cups user had groomed a teen user of the platform for a romantic relationship. 

McKay exchanged emails with the X user, noting to them that "we take these concerns you've shared with us very seriously." Mashable reviewed documentation of the correspondence and verified the identity of the anonymous user. 

Mashable couldn't vet the specific claim of online grooming presented in the tweets because the user being referenced as a victim was anonymous. However, a separate Mashable investigation into 7 Cups identified key lapses in the platform's approach to safety, like the ability of banned users to quickly return to the platform using burner accounts and regular experiences of harassment and abuse, along with multiple anecdotal reports of concern that adult users were attempting to groom minors.

McKay believes she shared the claims directly with a state official overseeing the initiative at the California Mental Health Services Authority, or CalMHSA. She recalls that the agency took the claims seriously. The state documents obtained through a public records request by Cal Voices appear to substantiate McKay's account of alerting CalMHSA to the incident on X. 

Invoices submitted to CalMHSA by Murphy, Campbell, Alliston & Quinn (now Quinn Covarrubias), the Sacramento-area law firm it consulted on matters related to Tech Suite, show charges related to discussions of safety on 7 Cups, held in the days after the anonymous tweets were posted. 

McKay, who was fired by the L.A. County Department of Mental Health in 2020, settled a wrongful termination suit against the county in 2022. McKay partly attributed her termination to questioning department practices, including some related to the Tech Suite project.

What California officials didn't tell the public

This wasn't the first time that CalMHSA or its lawyers fielded concerns related to Tech Suite and 7 Cups. 

While issues had been raised in late 2018 and early 2019, the situation escalated in April 2019. A group of staff and contractors in the participating counties had been testing 7 Cups, according to documents obtained through the records request and to sources familiar with the process. The testers messaged with other users and participated in public chat rooms. Some of those testers were disturbed by what they encountered, according to sources with knowledge of their concerns.

The testers noticed inappropriate handle names and profile photos. Some received unwanted sexual messages. Bad actors name-called and taunted people engaging with the platform. Inappropriate behavior allegedly included descriptions of sexual fetishes and domestic violence. One tester reported being asked to trade photos of feet for sexual gratification. Some of the testers messaged with users who requested to meet in person for a sexual encounter. Mashable reviewed documentation of these concerns.

One tester reported messaging with a 7 Cups user who described a hypothetical act of child sex abuse in graphic detail. Mashable confirmed the details of this incident with sources who were familiar with what happened. 

Almost none of this appeared in the quarterly and annual public evaluation reports produced by UC Irvine. These documents provided the only thorough public accounting of the project's progress.

In a quarterly report that covered September 2018 through February 2019, the researchers briefly recommended that 7 Cups "vet listeners more carefully." The evaluators themselves documented a user who displayed a Confederate flag as a symbol, but that is the only detail included as an example of poor listener quality.

By April 2019, Kern County, home to the central Californian city of Bakersfield, had discontinued use of 7 Cups, according to the county's report on the project, which was obtained via Cal Voices' public records request.

That report noted that Kern County felt 7 Cups was a "poor fit" for its community. Some of the county's mental health clients had histories of domestic violence, sexual abuse, and other types of trauma. The report noted multiple frustrations with the app, including: concerns about the "quality" of listeners; "unmoderated chat rooms, unmoderated listeners"; and difficulty "screening out dangerous folks" from becoming listeners. Additionally, the county was concerned about endorsing the app without solving these safety issues. 

Though the UC Irvine researchers were in regular contact with Kern County officials during this time period, according to their own reporting, these problems were omitted from their quarterly evaluations.

When 7 Cups was terminated from the project altogether, in August 2019, it merited only a single-sentence mention in the quarterly report, with no acknowledgment of the alarming safety problems that had surfaced. In their annual report, published in February 2020, UC Irvine researchers again omitted why 7 Cups had lost its contract with the state. The researchers did not go into detail about safety concerns. 

Sources familiar with the factors that contributed to 7 Cups' contract cancellation, who asked to remain anonymous, told Mashable that safety issues were critical, in addition to concerns that the company was failing to implement counties' demands for specific features quickly enough. 

Some of those sources also noted that California officials, including those at the county level, had naive or unrealistic expectations of technology and the platform, and didn't understand how long it takes to iterate new ideas. At the same time, county officials believed 7 Cups offered a specific type of high-quality peer support, and were frustrated to learn that it largely did not, according to sources familiar with their experiences.

Moriarty told Mashable in an email that he was not made aware of the concerning safety issues identified by the county testers, and that the company ultimately built the desired functionality into the platform. 

"I regret that we were not able to serve the counties in the way we had envisioned," he said. 

Even commissioners who served on the Mental Health Services Oversight & Accountability Commission were puzzled by 7 Cups' abrupt removal from the project. 

In the only public meeting on Tech Suite's progress, held in Sacramento in February 2020, commissioner Dr. Itai Danovitch said that he held the project to a "high standard" and asked whether it was "on target," according to the meeting minutes.

"There are some things that were significant elements of past presentations, such as 7 Cups, that have disappeared with no explanation." - Dr. Itai Danovitch, MHSOAC Commissioner

"One of the challenges in answering this question is that every presentation on this project has been completely different," Danovitch said, in the minutes' summary of his comments. "There are some things that were significant elements of past presentations, such as 7 Cups, that have disappeared with no explanation."

Another commissioner asked the meeting's presenters to confirm that 7 Cups was no longer part of the project. At this point, CalMHSA had agreed to the settlement with 7 Cups and was beholden to its confidentiality agreement. The commission was not a party to the settlement, according to its executive director, Toby Ewing.

Jeremy Wilson, program director and public information officer at CalMHSA at the time, confirmed that 7 Cups wasn't a vendor. He explained that "it was determined by the counties that the peer chat product was not going to fit the need for the counties on this project," according to the minutes. He also noted that the company had chosen not to apply for a second round of proposal requests issued in September 2019 to identify new products for the initiative. 

Tech Suite was never put on a commission meeting agenda again, despite efforts by advocates to revisit the topic. Ewing told Mashable that the commission will not consider putting Tech Suite back on the agenda "at this time" and that the body was focused on "pressing needs" like school mental health and strengthening suicide prevention strategies.

Susan Gallagher, executive director of Cal Voices, was one of the advocates who asked the commission to revisit the initiative. She told Mashable that Cal Voices and other stakeholder groups had long raised concerns about Tech Suite with the commission on issues like safety, privacy, and usefulness to consumers. The nonprofit made its first public records request about Tech Suite in 2019, and submitted two requests in 2020 to obtain additional documents.  

In a January 2022 letter to the commission, Gallagher urged the body to "seriously assess the outcomes and budget" of the initiative. Before speaking with Mashable, Gallagher was unaware of the more serious safety concerns that emerged after testers used 7 Cups. 

"It's not really that surprising to me, although it is very devastating to think that could've happened to vulnerable people who we were supposed to be helping," she said. 

Dr. Matt Mishkind told Mashable that the failures to fully disclose the concerns over 7 Cups deprived consumers of the opportunity to better inform and protect themselves from harm on the platform, or on other similar platforms, of which there are now several.

Mishkind added that state officials had an ethical obligation to inform the public. 

"When it's paid for with taxpayer dollars, I think there should be transparency there. People should know why something, especially something pretty big, did not work, and also what the recommendations and solutions are to it," said Mishkind. "I think that's how it should work. I don't know that I'm so naive to say that's how it works." 

What happened at 7 Cups

In a non-public report submitted to county leadership in 2019, Moriarty acknowledged that the company had to address safety concerns. He described moving from a "reactive approach" of using a list of keywords that triggered human flagging and censoring of inappropriate content to "far more advanced monitoring." The company reduced access for "guest" and "unverified" users, or people who created an account without using an email address to sign up for the service. 

7 Cups also began using a "suspicion-scoring mechanism" in conjunction with "trust scoring" to detect and sanction behavior associated with acting in "undesirable ways."  

"We continue to run experiments now to balance the tension between making it too difficult for people to get help, while also increasing their safety," he wrote. 

Mashable's recent reporting on 7 Cups indicates that some of the same problems continue to plague the platform. Users who exhibit trolling behavior can still access the platform, even after they've been banned. Last summer, one such user told some members to kill themselves, according to a source familiar with current safety issues on the platform. 

Moriarty told Mashable that the company "continuously" improves safety measures and that there were fewer related challenges than five years ago, when 7 Cups' contract was terminated. 

Moriarty's tone in the 2019 report was optimistic: "We believe we have a lot of important work to do and appreciate the honor of continuing the work we started." 

7 Cups received a 30-day notice of termination of its original contract on Aug. 31, 2019, according to minutes from a September 2019 leadership meeting attended by the county representatives and CalMHSA staff. 

The settlement with CalMHSA went into effect a few months later, on Nov. 1. At the end of October, Moriarty laid off 10 members of 7 Cups' staff. The decision blindsided and shocked the affected employees, according to sources with knowledge of the events.

In a 7 Cups forum post on Nov. 6, Moriarty announced that "we had to let some people go." Though he'd previously posted about the California initiative, he made no mention of it in his announcement. He did, however, note that the sitewide cost-cutting included shuttering unmoderated group chat rooms, for safety reasons. 

Likely anticipating disappointment from users who'd become accustomed to accessing chat rooms at any time, Moriarty tried to appeal to a sense of unity.

"If this is where you want to be and you are open to our attempts to increase safety while also surviving the challenges of a startup, then we have each others [sic] backs and will steer into problems together solving them one at a time," Moriarty wrote.

Despite the 7 Cups failure, the state of California hasn't given up on digital mental health products that incorporate some form of peer support. In January, the state's Department of Health Care Services launched access to an app for 13- to 25-year-olds called Soluna. Users can sign up anonymously to access, among other well-being features, moderated forums where they can "post a question, get or give advice, or just chat about whatever's on your mind."

What to learn from California's 7 Cups experiment

Given that the public never learned specifically why 7 Cups lost its contract with California, the company has not faced any widespread criticism or fallout.

That same year, however, Moriarty launched the nonprofit 7 Cups Foundation "to support expanding access to quality, affordable, and innovative mental health care through volunteers, clinicians, and technology" with a sizable donation from Sutter Health, a nonprofit, California-based medical system. The 7 Cups Foundation ended up returning the donation in 2020.

Moriarty said the funding was meant to provide education for new mothers. The foundation paid $450,000 back to Sutter Health, which Moriarty said occurred after John Boyd, the organization's former CEO for System Mental Health Services, requested the remainder of the unspent funds. In 2018, when 7 Cups received its contract to serve California counties, Boyd served on the state commission that approved funding for the project. Sutter Health declined to comment on the matter.

As for 7 Cups users, some have continued to anonymously share their own negative experiences on the platform, which include alleged trolling and abusive behavior. 

When Mashable asked Moriarty about some of these complaints in February 2023, prior to learning about Tech Suite, he noted that the platform's massive size and reach meant negative experiences were bound to arise. 

"No matter how hard we try or the changes we make, ultimately people are people and they make mistakes and/or behave in unhelpful ways," Moriarty wrote in an email, adding that the same can occur with licensed professionals. "Humans — licensed or not — can be messy." 

"No matter how hard we try or the changes we make, ultimately people are people and they make mistakes and/or behave in unhelpful ways." - Glen Moriarty, CEO of 7 Cups

The failure to disclose what happened with 7 Cups has also meant less scrutiny for similar startups. 

Wisdo Health, a competitor to 7 Cups that counts Dr. Insel as an advisor, struck a deal last year with a Colorado mental health nonprofit to make the platform available to adults in the state. A press release announcing the partnership said there would be a particular emphasis on reaching the state's Medicaid members. 

Though Mashable didn't learn of serious safety concerns related to Wisdo Health during the course of reporting, it did find that the industry at large hasn't resolved or even made clear to consumers the risks of using emotional support platforms. 

Nor is there evidence to indicate that such platforms are as good or better than other mental health interventions, like therapy or even computer-based cognitive behavioral therapy, a form of treatment that can help people better manage their thoughts and feelings. 

Still, the Department of Health and Human Services included Wisdo Health in a list of resources for improving social connection, a clear sign that power brokers take the model seriously. 

UC Irvine's lack of transparency has had ripple effects, too. 

In January 2023, the UC Irvine researchers, including Dr. Stephen Schueller, published a paper on their work evaluating Tech Suite in the journal JMIR Formative Research. The authors made a brief mention of safety issues, but without naming 7 Cups. The paper included an acknowledgment that CalMHSA "reviewed the manuscript to ensure its confidentiality."

The authors recommended that chat forums be monitored for bullying and abuse.

"[S]ervice providers also shared concerns about inappropriate web-based interactions that could take place on peer support platforms and chat rooms; one of the service providers stated, 'But these chat rooms are not monitored, and so anyone can pop on and say a number of horrible things, and no one’s there to monitor that behavior. And we didn’t know that,''' the researchers wrote. 

Until last summer, when it shut down after losing its funding, Schueller led One Mind PsyberGuide, a nonprofit website designed to help consumers vet digital mental health tools. The PsyberGuide's review of 7 Cups contained no safety warnings, but did note that one of the app's target audiences is adolescents. The app received a 3.64 credibility rating out of 5.

A 2022 study on using mental health apps for distress during the pandemic selected 7 Cups as one of its interventions partly because it was "highly rated" by PsyberGuide. Schueller told Mashable in an email that while he could not comment on 7 Cups and Tech Suite, any safety warnings would have been reflected in the PsyberGuide's transparency rating, which was not conducted for 7 Cups. "So nothing involved in our credibility review would speak to safety issues," he wrote. 

Insel and Moriarty told Mashable that they hadn't spoken to each other in years. Insel's daughter left the company in May 2019.

Together, Insel and his daughter have since founded Humanest Care, along with Twisha Anand, the former head of operations at 7 Cups. The startup offers mental health tools, courses, and counseling services, primarily on college campuses. Insel has described the company as "building an online community empowering people to get help and give help."

Insel wrote in an email to Mashable that he didn't know about 7 Cups' settlement with CalMHSA, acknowledging that, at the time, "there was less awareness of how online peer support could become toxic."

He added that he believes the U.S. needs a regulatory infrastructure for digital mental health or, at the very least, best practices so an agency like CalMHSA will know how to rigorously evaluate a digital mental health company.

"I don't know what happened with the [T]ech [S]uite in CA," Insel wrote Mashable, "but there should be some lessons learned."

If you are a child being sexually exploited online, or you know a child who is being sexually exploited online, or you witnessed exploitation of a child occur online, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Categories: IT General, Technology

Buying a Tesla? The company will first take you on self-driving ride

Mashable - Tue, 03/26/2024 - 09:31

Customers who purchase a Tesla will now get a demonstration of the car's self-driving capabilities before their vehicle is handed over.

According to a new report by Bloomberg, the order comes directly from CEO Elon Musk.

"Going forward, it is mandatory in North America to install and activate FSD V12.3.1 and take customers on a short test ride before handing over the car," wrote Musk in an internal memo sent to Tesla employees.

By "FSD V12.3.1" Musk means the latest version of Tesla's Full Self-Driving package, an optional $12,000 upgrade which, according to Tesla, enables your car to "be able to drive itself almost anywhere with minimal driver intervention." Following a closed trial, Tesla opened up FSD as beta software to everyone in the U.S. in 2022, though the system still hasn't reached the levels of autonomy long promised by Musk.

In the memo, Musk recognizes that the practice of having every customer go through a test ride will "slow down the delivery process," but he reasons that "almost no one actually realizes how well (supervised) FSD actually works."

The idea behind the move, which will surely make life harder for some Tesla staffers, is likely that more people will choose to shell out the $12,000 and upgrade to FSD after seeing what it can do.

SEE ALSO: Tesla Model 3 'Ludicrous' will be more than just a speedier M3, new leaks show

Buyers who do opt in for the package should be wary of Musk's promises regarding FSD. In 2020, Musk claimed that, after regulatory approval, the value of FSD will be "somewhere in excess of $100,000," and that owners will be able to send their Teslas to work as robotaxis, effectively earning them money. Years have passed, though, and FSD still isn't close to that level of autonomy, with regulators increasingly looking into Tesla's self-driving features in recent years.

Categories: IT General, Technology

Social media now unlawful for kids under 14 in Florida

Mashable - Tue, 03/26/2024 - 08:41

Florida has just enacted a new law restricting social media access for children. Those aged 15 and under must now obtain their parents' consent to have a social media account, while children under 14 aren't allowed to have one at all. That's no more TikTok, no more Snapchat, no more Facebook, and no more Instagram.

Governor Ron DeSantis signed House Bill 3 (HB 3) on Monday, after previously vetoing similar legislation earlier this month. At the time, he cited the imminent arrival of a "superior" bill that would "[support] parents' rights." It appears that HB 3 is that bill.

SEE ALSO: Instagram Reels reportedly shows sexual content to users who only follow children

"Social media harms children in a variety of ways," said DeSantis in a statement on Monday. "HB 3 gives parents a greater ability to protect their children."

The new law will go into effect on Jan. 1 next year, which gives Florida's kids a bit of time to either persuade their parents to sign their permission slip, or download all their posts before their accounts are deleted.

Many social media platforms already impose age requirements in their terms of service. Facebook, Instagram, and Snapchat all require users to be at least 13 years old, while TikTok provides users under the age of 13 with a "curated, view-only experience… that includes additional safeguards and privacy protections." HB 3 will impose a higher legal age limit of 14, as well as charge penalties to social media platforms for any violations.

How to legally use social media if you're a child in Florida

Under Florida's new legislation, social media platforms must prohibit children under the age of 14 from creating accounts, as well as delete any such accounts which already exist. Florida's 13-year-olds can't save their TikTok accounts simply by lying about their age, either. Accounts that are treated or categorised as belonging to a user likely under the age threshold must also be terminated. As such, accounts could be deleted if the platform's algorithm determines that a user's preferred content indicates they are 13 or younger.

This doesn't mean children are immediately set loose to flood Snapchat with Stories on their 14th birthday. While children aged 13 and under are completely banned from social media, 14- and 15-year-olds can only have accounts if their parent or guardian provides consent to the platform. If a child can't obtain such consent, it's strictly emails and group chats for them until they turn 16. A caregiver can also revoke consent and request a company straight up delete their child's social media accounts. 
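
Reduced to compliance logic, the age tiers described above form a simple decision rule. Here's a minimal sketch of a hypothetical platform-side check; the function and argument names are illustrative, not taken from the bill's text.

```python
def account_allowed(age: int, has_parental_consent: bool) -> bool:
    """Apply HB 3's age tiers as described above (illustrative only).

    Under 14: no account at all. 14 or 15: allowed only with parental
    or guardian consent. 16 and up: no HB 3 restriction.
    """
    if age < 14:
        return False
    if age <= 15:
        return has_parental_consent
    return True

# A 15-year-old without consent must be refused (and any existing
# account deleted); a 16-year-old needs no consent at all.
assert account_allowed(15, has_parental_consent=False) is False
assert account_allowed(16, has_parental_consent=False) is True
```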

Fortunately, HB 3 explicitly excludes platforms that are exclusively for sending content to clearly identified recipients, meaning email and direct messaging services are still available to all regardless of parental approval. The legislation instead focuses on social media platforms where content can be posted publicly, including those such as Snapchat which facilitate both public posts and private messaging.

Despite the new law, Florida's government won't be hunting down middle school students doing TikTok dances. Rather, it's the social media platforms that will face consequences should any kids create accounts. Any social media platform that doesn't comply with HB 3's social media age restrictions can be fined up to $50,000 per violation, and may also be sued on the child's behalf for up to $10,000 in damages.

Keeping Florida's kids off TikTok isn't HB 3's only mandate. The legislation also imposes age verification requirements for pornographic websites, following similar laws in other states.

Categories: IT General, Technology

Get a Windows 11 Pro lifetime license for under £20

Mashable - Tue, 03/26/2024 - 06:00

TL;DR: Microsoft Windows 11 Pro is on sale for £19.80, saving you 87% on list price.

You don't necessarily have to buy a new computer to upgrade your tech arsenal. Sometimes, all it takes is installing a brand-new operating system to boost your productivity and enhance your computer's security. If you happen to have a Windows PC that meets the necessary requirements, consider breathing new life into it by upgrading to Windows 11 Pro.

For a limited time, the coveted Windows 11 Pro license is on sale for £19.80. This OS boasts a bunch of nifty features, including a new-and-improved interface, advanced security, and an elevated gaming experience.

Windows 11 Pro is great for modern professionals, especially those working in a remote or hybrid setup. For instance, if you're a stickler for security, the OS packs BitLocker device encryption, meaning you can lock your device remotely so no one else can access it and its contents. Windows Information Protection is included, too, allowing you to remotely manage your system to prevent data leaks. You can also take advantage of biometric login, Smart App Control, advanced antivirus defenses, and more.

In terms of usability, enjoy Windows 11 Pro's new interface, which includes snap layouts, seamless redocking, better voice-typing, and a more powerful search experience. It's primed for play, too, thanks to DirectX 12 Ultimate, which bolsters the quality of your computer's graphics, resulting in more immersive gaming.

You'll have all these and more with a one-time purchase of the license. You can install it on up to two devices, provided that they have 4GB RAM and 40GB of hard drive space.

For a limited time, you can grab a lifetime license to Windows 11 Pro for only £19.80.

Categories: IT General, Technology

The best gaming headset for the Xbox Series X

Mashable - Tue, 03/26/2024 - 05:59

If you're serious about your Xbox — and what Xbox owner isn't? — there's an arsenal of stuff you need from a gaming session. A console, obviously — in this case, the Xbox Series X — plus a controller, gaming chair, and if you’re putting in a long shift, maybe some energy drinks, snacks, and comfy clothes. But there’s one thing missing from this list, one item that’s almost as crucial as the controller. That’s right: the gaming headset.

We assume that gamers understand the importance of a high-quality gaming headset. But just in case there is any doubt — or you need some persuasion to upgrade — we're happy to outline why a headset is almost certainly the most important accessory for your console.

It’s worth knowing about, because buying a new gaming headset isn’t as straightforward as it sounds. It’s something very personal. Some headsets will suit some gamers better than others. Here’s a quick guide to get you started.

Do you need a gaming headset?

More than just protecting those around you from the racket of your campaign, gaming headsets allow you to fully immerse yourself in your game. They block out background noise and produce an accurate, detailed, and powerful sound that puts you in the centre of the action. The promise of a more immersive experience is enough for most gamers, but there is another important reason for considering a gaming headset. Nowadays, a headset can be the difference between success and failure. Experienced gamers will know that being able to pick up on subtle sounds is absolutely vital at crunch time, especially with games like Call of Duty and Fortnite. If you can hear footsteps and gunshots with pinpoint accuracy, you'll have a serious advantage over the competition.

What are the most important features in a gaming headset?

There are a lot of options out there. This is a good thing, but it also complicates matters. How are you supposed to know which headset will suit you best? What are the best headset brands? What features will boost gaming experience and performance? Here are a few features to consider:

  • Comfort — Nothing ruins a gaming session like an uncomfortable headset, even if it produces incredible sound. If it's not comfortable, it's not worth considering. Before you jump in and buy yourself something that looks great, we recommend considering how the headset will fit on your head. Is it a tight fit? How much does it weigh? Does it have any cooling technology? Do you need memory foam ear pads?

  • Durability — The majority of gaming headsets are made from plastic, which can mean a flimsy and cheap feel. If you're not able to find something with a durable metal or wire frame, you should at least try and pick a headset that has a good range of movement. This will offset some of the pressure that comes with a plastic build. The very best gaming headsets will come with durable materials like leather that will provide long-lasting comfort.

  • Isolation — Some headsets and games make use of very slight directional audio cues, which is why the quality of a headset's seal heavily impacts the quality of the sound you receive. It is therefore important to consider how the foam of the headphone pad will mold to the shape of your head. If the seal isn't tight, you should either find something that fits better, or purchase replacement pads that fit better.

  • Microphone — If the headset comes with a built-in mic (and it should), check for noise-cancelling functions, a convenient mute function, and a boom. These improve voice clarity, minimise outside distractions, and ensure your privacy. It's easy to overlook the microphone, but you really shouldn't, especially if you regularly play cooperative online games.

  • Noise Cancellation — This technology uses little microphones that detect incoming noise and produce anti-soundwaves to counteract it. That means you can focus on the audio that matters (see the sketch after this list).

  • Style — There are more important things to consider, but style is essential to many gamers. Most of the best gaming headsets offer multiple colour options, or may have RGB lights that flash and dance along to the gameplay, and you can usually find something that works for you, your setup, and your vibe.

  • Surround sound — This means accurate omnidirectional hearing, giving you a heads up when enemies are close. The best gaming headsets are able to map sounds to virtual 3D locations where calibrated speakers can precisely position the sounds for a more realistic listening experience. If something or someone is sneaking up on you from the left, you’ll hear the sound from that direction.
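
For the curious, the principle behind active noise cancellation can be shown in a few lines of Python. The toy sketch below inverts the phase of a sampled noise waveform so that noise plus anti-noise sums to silence; real headsets do this continuously in analogue circuitry and DSP hardware, so treat this strictly as an illustration.

```python
import math

# Toy model of active noise cancellation: sample a noise waveform,
# invert its phase, and check that noise + anti-noise cancels out.
SAMPLE_RATE = 48_000  # samples per second, a common audio rate
FREQ = 100.0          # a low hum, in Hz

noise = [math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE) for n in range(256)]
anti_noise = [-sample for sample in noise]  # the phase-inverted copy

residual = [n + a for n, a in zip(noise, anti_noise)]
print(max(abs(r) for r in residual))  # 0.0: perfect cancellation in the toy case
```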

Keep these features in mind when you're shopping around for your next headset.

What are the best gaming headset brands?

We recommend big brands such as SteelSeries, Razer, Turtle Beach, and Logitech. These brands are tough to beat when it comes to advanced features. You can still find value with some of the lesser-known brands, but we wouldn't recommend straying too far. The biggest gaming brands are popular for a reason.

What is the best gaming headset for Xbox Series X?

That’s a question only you can answer because it really depends on the kind of gamer you are, but there is a perfect gaming headset out there for you. We’ve done some of the hard work by tracking down the best headsets for the Xbox Series X. Most of these devices will also work well with your PS5, PC, or mobile device.

These are the best Xbox Series X headsets in 2024.

Categories: IT General, Technology

The best keyboard for upgrading your PC gaming experience

Mashable - Tue, 03/26/2024 - 05:58

Think of a home computer and you'll almost certainly picture a monitor and keyboard. The keyboard is the most basic of computer accessories. You simply can't have a PC without one. But that doesn't mean you have to settle for a basic keyboard.

To the untrained eye, a keyboard looks like, well, a keyboard. It's hard to tell a standard office-style keyboard from a high-end esports model. But serious gamers know what sets the best keyboards apart. And not just RGB lighting. We're talking the kinds of features that will help you reach your peak gaming potential.

For those who aren't in the know, we've lined up this quick guide on gaming keyboards — not to mention a roundup of the very best models to consider.

Do you need a gaming keyboard?

If you're a casual gamer who only plays from time to time, you are probably going to be just fine with a standard keyboard. If you're a dedicated gamer who engages in competitive matches, or you put in serious campaign hours, a keyboard upgrade could make a big difference to your overall gaming experience.

Alongside your mouse, a keyboard is your number one gaming tool. So it pays to upgrade to something special. The best gaming keyboards are more responsive and react more quickly to give you an edge. When the difference between success and failure can come down to split seconds, high performance tech has an impact on your win rate.

What are the best features in a gaming keyboard?

These are the most important factors when it comes to picking a gaming keyboard:

  • Anti-ghosting — Anti-ghosting tech (the ability to capture simultaneous keystrokes without errors) has become a standard on gaming keyboards. All the keyboards listed below come equipped with some form of anti-ghosting tech. There are variations, however, in how many anti-ghosting keys can be used simultaneously. Some keyboards are more restrictive than others.

  • Backlighting — Gaming keyboards come equipped with either single colour backlighting, multi-colour RGB (Red, Green, Blue) backlighting, or no lighting at all. Some gaming keyboards even feature backlighting options that allow you to synchronise the lighting with the action on your screen. A truly immersive experience. 

  • Macro Keys — These are keys that perform a series of programmable commands in sequence. Essentially a handy shortcut depending on what game you're playing — a way to play more efficiently and economically. Keyboards come with dedicated macro keys, which you can assign command sequences to (see the sketch after this list).

  • Switches — Keyboard switches (the bits that sit underneath each key) pick up what you put down. Most gaming keyboards come equipped with one of a variety of different options for mechanical or membrane switches. The switches register your keystroke, so you want them to function with speed and accuracy.

  • Type — Mechanical keyboards are sturdier, offer a more instant reaction, and tend to be the preferred option for gamers. Membrane keyboards are ideal for quieter performance and quicker movement across the keys. And optical keyboards respond faster and tend to last longer.
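
To make the macro idea concrete, here's a toy sketch of a macro as a named sequence of timed key actions. The names and actions are made up for illustration; real macros are configured in a keyboard's driver software and injected as actual key events.

```python
import time

# A macro is just an ordered list of (action, value) steps.
MACROS = {
    "build_and_attack": [("press", "b"), ("wait", 0.05), ("press", "a")],
}

def run_macro(name: str) -> None:
    """Replay a macro's steps in order (printing instead of real key events)."""
    for action, value in MACROS[name]:
        if action == "wait":
            time.sleep(value)  # pause between keystrokes
        else:
            print(f"key '{value}' pressed")  # a real driver would inject the event

run_macro("build_and_attack")
```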

Additional features you should consider include ergonomics, hot-swappable keys, and design. The first step is to determine your gaming priorities, and whether they are specific to your favourite game or dictated by your budget. Then use this guide to pinpoint the perfect gaming keyboard to meet your unique gaming needs. 

What is the best gaming keyboard?

We've looked at all the key features — not to mention style, performance, and cost — to bring you a selection of your best keyboard options from top gaming brands like Razer and Corsair. We recommend that you check out everything in this list, and consider what will work best for you and your gaming world.

These are the best gaming keyboards in 2024.

Categories: IT General, Technology

How to Break Through When You’re Feeling Stuck

Harvard Management Tip of the Day - Tue, 03/26/2024 - 05:01

We all feel stuck sometimes at work. Maybe you want to quit your job and start another career. Maybe you’re trying to build a new skill or habit. Maybe you’re searching for a new idea to pitch. How can you take steps to get un-stuck? Start by silencing your inner cynic. In the early stages […]

Categories: Management

Wordle today: Here's the answer and hints for March 26

Mashable - Tue, 03/26/2024 - 03:00

Oh hey there! If you're here, it must be time for Wordle. As always, we're serving up our daily hints and tips to help you figure out today's answer.

If you just want to be told today's word, you can jump to the bottom of this article, where March 26's Wordle solution is revealed. But if you'd rather solve it yourself, keep reading for some clues, tips, and strategies to assist you.

Where did Wordle come from?

Originally created by engineer Josh Wardle as a gift for his partner, Wordle rapidly spread to become an international phenomenon, with thousands of people around the globe playing every day. Alternate Wordle versions created by fans also sprang up, including battle royale Squabble, music identification game Heardle, and variations like Dordle and Quordle that make you guess multiple words at once.

Wordle eventually became so popular that it was purchased by the New York Times, and TikTok creators even livestream themselves playing.

Not the day you're after? Here's the solution to yesterday's Wordle.

What's the best Wordle starting word?

The best Wordle starting word is the one that speaks to you. But if you prefer to be strategic in your approach, we have a few ideas to help you pick a word that might help you find the solution faster. One tip is to select a word that includes at least two different vowels, plus some common consonants like S, T, R, or N.
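
To make that tip concrete, here's a tiny sketch that scores candidate openers by how many distinct vowels and common consonants they cover. The weights are our own illustrative choice, not an official strategy.

```python
VOWELS = set("aeiou")
COMMON_CONSONANTS = set("strn")  # the consonants suggested above

def starter_score(word: str) -> int:
    """Score a candidate Wordle opener by distinct-letter coverage.

    Only distinct letters count: a repeated letter reveals no new
    information on the first guess.
    """
    letters = set(word.lower())
    return 2 * len(letters & VOWELS) + len(letters & COMMON_CONSONANTS)

# Rank a few candidates; higher scores cover more useful letters.
for candidate in ["adieu", "slate", "crane", "mamma"]:
    print(candidate, starter_score(candidate))
```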

What happened to the Wordle archive?

The entire archive of past Wordle puzzles used to be available for anyone to enjoy whenever they felt like it. Unfortunately, it has since been taken down, with the website's creator stating it was done at the request of the New York Times.

Is Wordle getting harder?

It might feel like Wordle is getting harder, but it actually isn't any more difficult than when it first began. You can turn on Wordle's Hard Mode if you're after more of a challenge, though.

Here's a subtle hint for today's Wordle answer:

The elected head of a city or town.

Does today's Wordle answer have a double letter?

There are no letters that appear twice.

Today's Wordle is a 5-letter word that starts with...

Today's Wordle starts with the letter M.

SEE ALSO: Wordle-obsessed? These are the best word games to play IRL.

What's the answer to Wordle today?

Get your last guesses in now, because it's your final chance to solve today's Wordle before we reveal the solution.

Drumroll please!

The solution to Wordle #1011 is...

MAYOR.

Don't feel down if you didn't manage to guess it this time. There will be a new Wordle for you to stretch your brain with tomorrow, and we'll be back again to guide you with more helpful hints.

Reporting by Caitlin Welsh, Sam Haysom, Amanda Yeo, Shannon Connellan, Cecily Mauran, Mike Pearl, and Adam Rosenberg contributed to this article.

Categories: IT General, Technology

NYT Connections today: See hints and answers for March 26

Mashable - Tue, 03/26/2024 - 02:00

Connections is the latest New York Times word game that's captured the public's attention. The game is all about finding the "common threads between words." And just like Wordle, Connections resets after midnight and each new set of words gets trickier and trickier—so we've served up some hints and tips to get you over the hurdle.

If you just want to be told today's puzzle, you can jump to the end of this article for March 26's Connections solution. But if you'd rather solve it yourself, keep reading for some clues, tips, and strategies to assist you.

What is Connections?

The NYT's latest daily word game has become a social media hit. The Times credits associate puzzle editor Wyna Liu with helping to create the new word game and bringing it to the publication's Games section. Connections can be played on both web browsers and mobile devices and requires players to group four words that share something in common.

Each puzzle features 16 words and each grouping of words is split into four categories. These sets could comprise anything from book titles and software to country names. Even though multiple words will seem like they fit together, there's only one correct answer. If a player gets all four words in a set correct, those words are removed from the board. Guess wrong and it counts as a mistake—players get up to four mistakes before the game ends.

Players can also rearrange and shuffle the board to make spotting connections easier. Additionally, each group is color-coded with yellow being the easiest, followed by green, blue, and purple. Like Wordle, you can share the results with your friends on social media.
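
For the technically curious, the core loop boils down to checking a four-word guess against the hidden groups and counting mistakes. Here's a minimal sketch with made-up groups (deliberately not today's puzzle):

```python
# Illustrative Connections-style guess checker; the groups are made up.
GROUPS = {
    "Colours": {"red", "blue", "green", "yellow"},
    "Animals": {"dog", "cat", "fox", "owl"},
}

MAX_MISTAKES = 4

def check_guess(guess: set[str], solved: set[str], mistakes: int) -> tuple[set[str], int]:
    """Return updated (solved, mistakes) after one four-word guess."""
    for name, members in GROUPS.items():
        if name not in solved and guess == members:
            return solved | {name}, mistakes  # correct: the group leaves the board
    return solved, mistakes + 1  # wrong: one of the four allowed mistakes used

# One right guess, then one wrong guess.
solved, mistakes = check_guess({"red", "blue", "green", "yellow"}, set(), 0)
solved, mistakes = check_guess({"dog", "cat", "fox", "red"}, solved, mistakes)
print(solved, mistakes, "game over" if mistakes >= MAX_MISTAKES else "keep playing")
```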

Here's a hint for today's Connections categories

Want a hint about the categories without being told the categories? Then give these a try:

  • Yellow: Devices that make sounds when played

  • Green: Raising plants

  • Blue: To nurture

  • Purple: Star shine

Here are today's Connections categories

Need a little extra help? Today's connections fall into the following categories:

  • Yellow: Musical Instruments

  • Green: Plant Growths

  • Blue: Bring Up

  • Purple: Solar Emanations

Looking for Wordle today? Here's the answer to today's Wordle.

Ready for the answers? This is your last chance to turn back and solve today's puzzle before we reveal the solutions.

Drumroll, please!

The solution to Connections #289 is...

What is the answer to Connections today?
  • Musical Instruments: BASS, BASSOON, HARP, RECORDER

  • Plant Growths: BLOOM, BUD, SHOOT, SPROUT

  • Bring Up: FOSTER, NURSE, RAISE, REAR

  • Solar Emanations: CORONA, FLARE, LIGHT, RADIATION

Don't feel down if you didn't manage to guess it this time. There will be new Connections for you to stretch your brain with tomorrow, and we'll be back again to guide you with more helpful hints.

Is this not the Connections game you were looking for? Here are the hints and answers to yesterday's Connections.

Categories: IT General, Technology

OpenAI is pitching Sora to Hollywood. Creatives are fighting back.

Mashable - Mon, 03/25/2024 - 22:03

OpenAI is ramping up plans for its AI video generator Sora — and that involves a charm offensive in Hollywood.

Details are still fuzzy, but we know the company is approaching filmmakers as well as studios. OpenAI CEO Sam Altman and COO Brad Lightcap are having "introductory conversations" with industry stakeholders, according to Bloomberg.

Unspecified "big name" directors and actors already have access to Sora, the report says. That's part as an effort to "encourage filmmakers to integrate its new AI video generator into their work."

SEE ALSO: OpenAI's GPT-5 release could be as early as this summer

Sora was unveiled in February — and though there hasn't been a public release yet, the announcement has raised concerns about the data that was used to train the model, and how it could impact the film industry.

As with ChatGPT, OpenAI hasn't been transparent about Sora's training data. But creatives already suspect Sora was trained by scraping art and videos without the knowledge or consent of their creators.

OpenAI is already facing several copyright infringement lawsuits, including allegations of this practice with the large language models that power ChatGPT.

The use of AI video tools threatens to upend the film industry by replacing jobs, from VFX professionals to writers and even actors.

The recent strike by Hollywood writers' and actors' unions (WGA and SAG-AFTRA) sought contractual limits on the use of AI in writers' rooms. The unions also fought studio proposals to create digital likenesses of actors that could be used in perpetuity without pay.

The WGA is voting soon on a tentative agreement that prevents AI content from being used as source material for writers' rooms. SAG-AFTRA, in its contract with studios, won promises of compensation and credit for AI likenesses — but did not succeed in banning the practice altogether.

Already, images featuring generative AI have crept into films such as Late Night with the Devil.

'Artistwashing'

Meanwhile, OpenAI published a blog post full of "first impressions" from a select group of testers who are visual artists, filmmakers, and creative directors — which tells a different story.

The post features feedback from director Paul Trillo, production company shy kids, creative agency Native Foreign, artist August Kamp, creative director Josephine Miller, and AR/XR artist Don Allen Stevenson III. Unsurprisingly, the feedback published on OpenAI's blog was overwhelmingly positive.

Testers praised Sora's ability to create photorealistic videos from text prompts and without constraints. "Not restricted by time, money, other people’s permission, I can ideate and experiment in bold and exciting ways," said Trillo.

But users on X were quick to point out OpenAI's controlled narrative. "Artistwashing: when you solicit positive comments about your generative AI model from a handful of creators, while training on people's work without permission/payment," wrote Ed Newton-Rex, CEO of ethical AI data sourcing non-profit Fairly Trained.

If OpenAI is gearing up to take on Hollywood, in other words, the company had better be prepared for cinematic drama.

Categories: IT General, Technology