
2025-11-25 18:00:00| Fast Company

In recent weeks, OpenAI has faced seven lawsuits alleging that ChatGPT contributed to suicides or mental health breakdowns. In a recent conversation at the Innovation@Brown Showcase, Brown University’s Ellie Pavlick, director of a new institute dedicated to exploring AI and mental health, and Soraya Darabi of VC firm TMV, an early investor in mental health AI startups, discussed the controversial relationship between AI and mental health. Pavlick and Darabi weigh the pros and cons of applying AI to emotional well-being, from chatbot therapy to AI friends and romantic partners.

This is an abridged transcript of an interview from Rapid Response, hosted by the former editor-in-chief of Fast Company Bob Safian. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today’s top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

A recent study showed that one of the major uses of ChatGPT is mental health, which makes a lot of people uneasy. Ellie, I want to start with you and the new institute that you direct, known as ARIA, which stands for AI Research Institute on Interaction for AI Assistance. It’s a consortium of experts from a bunch of universities backed by $20 million in National Science Foundation funding. So what is the goal of ARIA? What are you hoping it delivers? Why is it here?

Pavlick: Mental health is something that is very, I don’t even know if “polarizing” is the word. I think many people’s first reaction to the concept of AI mental health is negative. So as you can tell from the name, we didn’t actually start as a group that was trying to work on mental health. We were a group of researchers interested in the biggest, hardest problems with current AI technologies. What are the hardest things that people are trying to apply AI to that we don’t think the current technology is quite up for?
And mental health came up, and it was actually originally taken off the list of things we wanted to work on, because it is so scary to think about how big the risks are if you get it wrong. And then we came back to it exactly because of this. We basically realized that this is happening; people are already using it. There are companies, startups, some of them probably doing a great job, some of them not. The truth is we have a hard time even differentiating those right now. And then there are a ton of people just going to chatbots and using them as therapists. And so we’re like, the worst thing that could happen is that we don’t have good scientific leadership around this. How do we decide what this technology can and can’t do? How do we evaluate these kinds of things? How do we build it safely, in a way that we can trust? There’s a demand for answers to questions like these, and the reality is most of them we just can’t answer right now. They depend on an understanding of the AI that we don’t yet have, an understanding of humans and mental health that we don’t yet have, a level of discourse that society isn’t up for. We don’t have the vocabulary, we don’t have the terms. There’s just a lot that we can’t do yet to make this happen the right way. So that’s what ARIA is trying to provide: this public-sector, academic kind of voice to help lead this discussion.

That’s right. You’re not waiting for this data to come out, or for the final word from academia or from this consortium. You’re already investing in companies that do this. I know you’re an early-stage investor in Slingshot AI, which delivers mental health support via the app Ash. Is Ash the kind of service that Ellie and her group should be wary about? What were you thinking about when you decided to make this investment?

Darabi: Well, actually, I’m not hearing that Ellie’s wary. I think she’s being really pragmatic and realistic.
In broad brushstrokes, zooming back and talking about the sobering facts and the scale of this problem: one billion out of eight billion people struggle with some sort of mental health issue. Fewer than 50% of people seek out treatment, and the people who do often find the cost prohibitive. That recent study you cited is probably the one from the Harvard Business Review, which came out in March of this year and studied use cases of ChatGPT; its analysis showed that three of the top 10 use cases for foundational models broadly (numbers one, four, and seven) are therapy or mental health related. I mean, we’re talking about something that touches half of the planet. If you’re looking at investing with an ethical lens, there’s no greater TAM [total addressable market] than people who have a mental health disorder of some sort. We’ve known the Slingshot AI team, which has built the largest foundational model for psychology, for over a decade. We’ve followed their careers. We think exceptionally highly of the advisory board and panel they put together. But what really led us down the rabbit hole of caring deeply enough about mental health and AI to, frankly, start a fund dedicated to it (we did that in December of last year) was going back to the fact that AI therapy is so stigmatized, and people hear it and immediately jump to the wrong conclusions. They jump to the hyperbolic examples of suicide. And yes, it’s terrible. There have been incidents of deep codependence upon ChatGPT or otherwise whereby young people in particular are susceptible to very scary things, and yet those salacious headlines don’t represent the vast number of folks whom we think will be well served by these technologies.

You said this phrase, we kind of stumbled on [these] uses for ChatGPT. It’s not what it was created for, and yet people love it for that.
Darabi: It makes me think about 20 years ago, when everybody was freaking out about the fact that kids were on video games all day, and now because of that we have Khan Academy and Duolingo. Fearmongering is good, actually, because it creates a precedent for the guardrails that I think are absolutely necessary for us to safeguard our children from anything that could be disastrous. But at the same time, if we run in fear, we’re just repeating history, and it’s probably time to just embrace the snowball, which will become an avalanche in mere seconds. AI is going to be omnipresent. Everything that we see and touch will be in some way supercharged by AI. So if we’re not understanding it to our deepest capabilities, then we’re actually doing ourselves a great disservice.

Pavlick: To this point that, yeah, people are drawn to AI for this particular use case: on our team in ARIA, we have a lot of computer scientists who build AI systems, but actually a lot of our teams do developmental psychology, core cognitive science, neuroscience. There are questions to ask about the whys and the hows. What are people getting out of this? What need is it filling? I think these are really important questions to be asking soon. I think you’re completely right that fearmongering has a positive role to play. You don’t want to get too caught up on it, and you can point historically to examples where people freaked out and it turned out okay. There are also cases like social media, where maybe people didn’t freak out enough, and I would not say it turned out okay. People can agree to disagree, and there are pluses and minuses, but the point is that we are now in a position where we can start asking these questions. You can’t do things perfectly, but you can run studies. You can say, “What is the process that’s happening? What is it like when someone’s talking to a chatbot? Is it similar to talking to a human? What is missing there? Is this going to be okay long-term? What about young people who are doing this in core developmental stages? What about somebody who’s in a state of acute psychological distress, as opposed to using it as a general maintenance thing? What about somebody who’s struggling with substance abuse?” These are all different questions, and they’re going to have different answers. Again, on the idea of one LLM that is just one interface for everything: a lot is unknown, but I would bet that’s not going to be the final thing that we’re going to want.


Category: E-Commerce

 

LATEST NEWS

2025-11-25 17:22:33| Fast Company

If you’ve chosen a target asset allocation (the mix of stocks, bonds, and cash in your portfolio), you’re probably ahead of many investors. But unless you’re investing in a set-and-forget option like a target-date fund, your portfolio’s asset mix will shift as the market fluctuates. In a bull market you might end up with more equity exposure than you planned, or the reverse if the market declines. Rebalancing involves selling the assets that have appreciated the most and using the proceeds to shore up the assets that have lagged. This brings your portfolio’s asset mix back into balance and enforces the discipline of selling high and buying low. Rebalancing doesn’t necessarily improve your portfolio’s returns, especially if it means selling asset classes that continue to perform well, but it can be an essential way to keep your portfolio’s risk profile from climbing too high.

Where and how to rebalance

If it’s been a while since your last rebalance, your portfolio might be heavy on stocks and light on bonds. A portfolio that started at 60% stocks and 40% bonds 10 years ago could now hold more than 80% stocks. Another area to check is the mix of international versus U.S. stocks. International stocks have led in 2025, but that followed a long run of outperformance for U.S. stocks, so your portfolio might lack international exposure. (Keeping about a third of your equity exposure outside the U.S. is reasonable if you want to align with Morningstar’s global market portfolio.) Other imbalances might exist: growth stocks have gained nearly twice as much as value stocks over the past three years, and you might be overweight in specialized assets such as gold and bitcoin thanks to their recent run-ups. After assessing your allocations, decide where to make adjustments. You don’t need to rebalance every account; what matters is the overall portfolio’s asset mix, which determines your risk and return profile.
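The drift in that 60/40 example is easy to verify with a few lines of Python. This is a minimal sketch; the annualized returns used here (13% for stocks, 2% for bonds, roughly in line with the past decade) are illustrative assumptions, not actual market figures:

```python
# Sketch: how an unrebalanced 60/40 portfolio drifts, and what trades
# restore the target mix. Assumed returns are illustrative only.

def drifted_weights(weights, growth_factors):
    """Portfolio weights after each asset grows by its cumulative factor."""
    values = [w * g for w, g in zip(weights, growth_factors)]
    total = sum(values)
    return [v / total for v in values]

def rebalance_trades(values, target_weights):
    """Dollar amount to buy (+) or sell (-) per asset to hit the targets."""
    total = sum(values)
    return [t * total - v for t, v in zip(target_weights, values)]

# 60/40 portfolio left alone for 10 years (13%/yr stocks, 2%/yr bonds assumed)
start = [0.60, 0.40]
growth = [1.13 ** 10, 1.02 ** 10]
drifted = drifted_weights(start, growth)
print([round(w, 3) for w in drifted])   # stock weight has drifted above 0.80

# Trades that restore 60/40 on a portfolio that began at $100,000
values = [60_000 * 1.13 ** 10, 40_000 * 1.02 ** 10]
trades = rebalance_trades(values, start)   # negative = sell stocks
```

Under these assumed returns, the stock weight ends up around 81%, and the trade list shows the mechanics the article describes: sell the appreciated asset, buy the laggard, in amounts that net to zero.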
It’s usually most tax-efficient to adjust within a tax-deferred account such as an IRA or 401(k), where trades won’t trigger realized capital gains. For example, if you’re overweight on U.S. stocks and light on international stocks, you could sell U.S. stocks and buy an international-stock fund in your 401(k). If you need to make changes in a taxable account, you can attempt to offset any realized capital gains by selling holdings with unrealized losses. That might be difficult, as the strong market environment has lifted nearly every type of asset over the past 12 months. Only a few Morningstar Categories (including India equity, real estate, consumer defensive, and health care) posted losses over the trailing 12-month period ended Oct. 30, 2025. The average long-term government-bond fund lost about 8% per year for the trailing five-year period as of the same date, so those could offer opportunities for harvesting losses.

Required minimum distributions can also be used in tandem with rebalancing. Account owners have flexibility in which assets to sell to meet RMDs, and if you own several different traditional IRAs, you could take the full RMD amount from any of them. Selling off the holdings that appreciated the most can bring the portfolio’s asset mix back in line with your original targets. Another option is funneling new contributions into underweight asset classes. Depending on the size of additional investments, this approach might take time, but it’s better than not rebalancing at all. It might also appeal if you’ve built up capital gains you don’t want to realize.

Final thoughts

Rebalancing is especially important in extremely volatile times. But even in a more gradual bull market like that of recent years, it’s important for keeping a portfolio’s risk level in check, especially for investors as they approach retirement and start spending from their portfolios.

___ This article was provided to The Associated Press by Morningstar. For more personal finance content, go to https://www.morningstar.com/personal-finance

Amy C. Arnott, CFA, is a portfolio strategist for Morningstar.



 

2025-11-25 17:10:00| Fast Company

The Trump administration left nursing off a list of “professional” degrees, a move that could directly limit how future nurses finance their education. Removing the profession from the list will have a major impact following the passage of President Trump’s One Big Beautiful Bill Act, which introduced a cap on borrowing. As of July 1, 2026, students who are not enrolled in professional degree programs will be subject to a borrowing cap of $20,500 per year and a lifetime cap of $100,000. Professional degrees, however, carry higher limits: $50,000 per year and a $200,000 lifetime cap.

‘A backhanded slap’

Nursing is the largest healthcare profession in the United States, with about 4.5 million registered nurses. And given that most nurses (76%) rely on financial aid to pay for their education, the move has drawn immense backlash, as it is widely viewed as a slight against the profession. That’s especially true because nurses, whose lengthy list of responsibilities includes providing frontline patient care, running lab work, assisting in procedures, and more, are often seen as one of the most essential pieces of the healthcare system.

Bassey Etim-Edet, a high-risk labor and delivery nurse in Baltimore who was on the front lines of care during the COVID pandemic, told Fast Company that the Trump administration’s move sets the wrong precedent and that the impact can’t be overstated.

“To go from ‘healthcare hero’ to not being recognized as a professional is such a backhanded slap,” Etim-Edet says, “especially at a time when legal precedent has made it clear that nurses are as responsible for provider mistakes as the providers themselves.”

“We are disrespected, underpaid, and under-resourced,” she added. “Still, we serve.”

Etim-Edet, who graduated with $150,000 in student loans, says her career wouldn’t have been possible without the HRSA Nurse Loan Repayment Program.
“In exchange for working 23 years at a critical access hospital, the government paid back a massive percentage of my loans,” Etim-Edet explained. “At the end of my service commitment, my loan balance was down to about $60,000. I was able to buy a home, start a family, and live” because of the program.

Fever pitch

In response to the move, the American Nurses Association (ANA) launched a petition aimed at fighting the lower classification. It warned, “This move stems from an effort to rein in student loan debt and tuition costs as part of the One Big Beautiful Bill Act; however, it means that postbaccalaureate nursing students would only be eligible for half the amount of federal loans as graduate medical students.”

The petition continued, “We call on the Department of Education to revise the proposed definition of ‘professional degrees’ to explicitly include nursing.”

Amid the backlash, the Department of Education called concerns around the move “fear-mongering” by “certain progressive voices” in a lengthy statement released on Monday, November 24. “The definition of a ‘professional degree’ is an internal definition used by the Department to distinguish among programs that qualify for higher loan limits, not a value judgement about the importance of programs,” the statement reads. “It has no bearing on whether a program is professional in nature or not.” It also noted that “95% of nursing students borrow below the annual loan limit and therefore are not affected by the new caps.” A spokesperson for the Department of Education referred Fast Company to the statement when reached for additional comment.

Still, nurses seem to disagree. “At a time when healthcare in our country faces a historic nurse shortage and rising demands, limiting nurses’ access to funding for graduate education threatens the very foundation of patient care,” Jennifer Mensik Kennedy, president of the American Nurses Association, said in a statement.
“In many communities across the country, particularly in rural and underserved areas, advanced practice registered nurses ensure access to essential, high-quality care that would otherwise be unavailable.” The Trump administration’s move comes as the nationwide nursing shortage is expected to continue to worsen. Etim-Edet adds that, as the system is already collapsing, younger people who greatly value work-life balance won’t want to work in a career that isn’t financially accessible or good for their emotional health. 



 
