
2025-10-30 09:00:00| Fast Company

As the largest art and design school in the United States, with nearly 17,000 students enrolled at its Savannah and Atlanta campuses, the Savannah College of Art and Design prides itself on offering a course of study for almost every type of creative person. Along with degree programs in animation, film and television, game development, graphic design, and illustration, SCAD tempts students with courses in beauty and fragrance, sneaker design, luxury and brand management, and equestrian studies.

There's a new degree program this school year, in Applied AI, that is attracting a different sort of attention. As I learned directly from faculty, students, and industry veterans, and read on Reddit forums, the idea is enticing to some, and prompting others to question the school's priorities and its very reason for existing.

Using AI Right

The idea behind the Applied AI program at SCAD is to equip students with the knowledge and skills that employers want: SCAD's website boasts that within a year of graduation, 99% of students are typically employed, pursuing further education, or both. According to SCAD, the Applied AI program will prepare students for professions including AI product developer, AI design strategist, AI story engineer, autonomous agent designer, and ethical design strategist. SCAD is also offering a minor in Applied AI that's open to students across all majors.

"This is a researched best guess about a changing employment landscape," says Nye Warburton, chair of interactive design and game development at SCAD, who leads the Applied AI program. "Our assumptions are that AI will create new domains in product development, design, story, and ethically focused systems. Our major is designed to develop the skills and practices that we see emerging in those fields."

Guided by input from creative and design leaders in industry, the Applied AI curriculum was developed by Warburton in collaboration with SCAD's curriculum and assessment team, the deans of the school of creative technology and the school of animation and motion, and various professors and department chairs. It has three pillars: story, action, and impact.

"We already have foundational and general education classes at SCAD," says Warburton, referring to offerings such as drawing and design thinking, math and English. "So if you're a writer or an architect, or whatever, you still need to have those fundamental understandings." The AI story classes, he stresses, are not about that kind of basic storytelling, but aim to help students build more resilience and understanding of their purpose (what they want to use AI for) as they move forward.

The impact component of the program involves making sure students understand intellectual property, environmental issues, and other broad concerns about the use of AI. The last component of this is an impact test, currently in development, that students will have to pass after their sophomore year. "We don't let you go to the higher levels, designing AI agents and doing capstone work, until you've placed out of these civics classes."

The action part of the program looks at workflow practices and application-specific ways of using them, says Eric Allen, associate chair of interactive design and game development and an instructor in the Applied AI program. Major game studios, for example, are already using AI models for generating and amalgamating ideas, he says. And, of course, they vibe code, building plugins that can help them in their workflows.
SCAD aims to engage faculty from across the school to teach these interdisciplinary classes, focusing on uses of AI that are most relevant to their fields. In addition to current faculty who will teach these cross-disciplinary classes, SCAD is in the process of hiring a dedicated Applied AI faculty.

"One of our main goals for curriculum development is to embrace design and art workflows and push [them] further with applied AI learnings," Warburton says. "It is an opportunity to invent new collaborative interdisciplinary things. And my hope is that that's where the jobs will be, in the intersection between these disciplines, even if traditional disciplines become disrupted."

Disruption and de-skilling

Of course, AI is already disrupting fields like gaming, film and animation, graphic design, and copywriting. In a 2024 State of the Game Industry report from the Game Developers Conference, 84% of developers indicated that they were somewhat or very concerned about the ethics of using generative AI, which has already contributed to large-scale layoffs in the industry. According to an industry tracker, an estimated 14,600 people were laid off from game development positions in 2024, up from 10,500 layoffs in 2023.

"Generative AI is an automation technology, full stop," says Reid Southen, a concept artist whose credits include the movies The Hunger Games, The Woman King, and Matrix Resurrections. While he agrees that there are cases where AI tools can speed up certain tasks and potentially benefit artists, he says, "There's no world in which it creates jobs." He and other artists are reporting less work and lower pay, and they're increasingly asked by clients to fix up concepts originally created with AI. Adding insult to injury, image- and video-generating models such as Midjourney, Stability AI's Stable Diffusion, and OpenAI's Sora are all facing lawsuits over the alleged use of copyrighted imagery in their development.

Another concern is that AI will lead to a broad de-skilling. "As an artist, you make thousands of micro-decisions while making a piece of art," Southen says. "Every brushstroke or click is a choice, and both you and the work evolve throughout that process. With AI, you give it a few large decisions about what you want, and it makes all those micro-decisions for you and fills in all the gaps."

Warburton understands the threat to artistry and industry know-how. "I think it is an obligation to vehemently defend the expertise that we have. The friction is, How do you use these new tools to augment these skills, as opposed to just prompting your way through it? I'm worried that we're not going to have a senior level of talent in the future," he says, noting that junior-level designers won't learn how to make creative decisions.

Enthusiasm varies

For these reasons and more, enthusiasm for the new AI curriculum varies wildly among SCAD faculty and students. "Many in industrial design, UX, architecture, and interior design are proactively moving forward in AI," Warburton says. "They have a clear idea of how they can integrate it into their processes. The business classes are really interested in how we do simulations and predictive models. I thought photography would be way against it, but a lot of the professors said they've already been disrupted by digital, so they're kind of ready for [AI]."

On the other hand, he says, "I'm not a very popular human being" in certain majors, such as illustration and sequential art (comic books, graphic novels).
"What I hear a lot is, I can't make a graphic novel with AI because the publishing industry will never accept it." The school's animation department is ambivalent. "The animation professors are really into it, because they see the possibility of the pipeline accelerations, like rigging and compositing and different kinds of rendering. However, the students are very against it," Warburton says, noting that students express concerns about the use of artist data in training generative models, and issues with AI's environmental impact. "However, I personally believe that automation anxiety is the root."

Since the program was announced this fall, four students have declared Applied AI as their major, and 25 have declared it as their minor. (The first official Applied AI class, AI 101, featuring exercises in developing a personal story and a crash course on LLMs and image models, starts this winter.) The school expects the number to increase significantly next academic year.

A cold day in hell

As always, the most honest discussion seems to be happening on Reddit. "The general consensus among students here is that the ai major is a joke. and so are the bots studying it," writes Redditor @sunadherstars. "I just graduated from the SEQA [sequential art] program like this past quarter and let me tell you: it'll be a cold day in hell before they ever push AI," writes @electricaaa. "The entire department is extremely against it, no matter how much the administration tries to push it."

Still, there's no doubt that for companies hiring creative talent, familiarity with AI is high on their wish list. Ami Frost, who will graduate from SCAD with a BFA in industrial design in December, reports that that's a recent shift. "Some friends have actually gotten jobs because they know AI. I think learning how to use AI is definitely going to be a good stepping point in your career, like it or not."

"Taking a few extra classes to get better at AI makes sense," says a recent SCAD graduate in UX and industrial design, who preferred not to give their name in order to be able to speak more freely. "But a degree drastically impacts the trajectory of your life. Would you want to choose something based on the hottest trend? Imagine if [SCAD] had started offering a major in NFTs when those were the next big thing. I think AI is too new to really have had the time for a robust education on how to use it to emerge. Personally, I'd rather wait for a boat to be fully constructed before taking it for a four-year ride down the Nile."

SCAD's critics have often felt that the school operates more like a business than a traditional academic institution (even though it operates as a nonprofit). For the fiscal year ending June 2024, the school reported revenue in excess of expenses of more than $220 million, and president and founder Paula Wallace received compensation of more than $2.6 million. (Yearly undergrad tuition and fees, minus room and board, is currently $42,665.)

But while it may be tempting to write off the Applied AI degree as a slapdash money grab, it's worth noting that SCAD's rivals are increasingly AI-curious. ArtCenter College of Design in Pasadena, California, and Rhode Island School of Design in Providence both offer classes that integrate AI. Last year, Ringling College of Art and Design in Sarasota, Florida, launched an AI certificate program similar to SCAD's minor.

Perhaps today's new generative AI tools will become like once-novel programs such as Adobe and CAD, absorbed seamlessly into the design process.
If they take designers' jobs, though, or thwart students' ability to learn the basics, there may be no one left to use them.



 

2025-10-30 08:30:00| Fast Company

Below, Paul Leonardi shares five key insights from his new book, Digital Exhaustion: Simple Rules for Reclaiming Your Life. Paul is a professor of technology management at the University of California, Santa Barbara. He is a frequent consultant and speaker to a wide range of companies, such as Google, Microsoft, YouTube, McKinsey, GM, and Fidelity. He is also a contributor to the Harvard Business Review.

What's the big idea?

We are the first generation in human history to carry the entire world's information, connections, and distractions in our pockets. It's no wonder that the technology once promised to make life easier now leaves us tired and overwhelmed. Paul Leonardi refers to this cognitive and emotional weariness as digital exhaustion. But it doesn't have to be this way. With intention, we can turn our devices from sources of drain into tools of connection, empowerment, and creativity. Listen to the audio version of this Book Bite, read by Paul himself, below, or in the Next Big Idea App.

1. Exhaustion isn't weakness, it's physics

Maya wakes up at 5:50 a.m. to her phone buzzing. Within seconds, she is scrolling on Instagram. A news alert pops up; she clicks it. Three text messages arrive; she switches to a different app. Her partner tries to talk to her, but she doesn't hear him because now she's checking WhatsApp. By the time Maya gets out of bed, she has made dozens of micro-decisions and context switches. Her brain has already started burning through its limited energy reserves, and she hasn't even had coffee yet.

This routine is a growing epidemic. In my research tracking over 12,000 workers across 12 countries for two decades, I found something startling. In 2002, the average digital exhaustion score was 2.6 out of 6. By 2022, it had skyrocketed to 5.5. My nine-year-old daughter looked at the graph and said, "It looks like a snake about to strike." She was right. And that snake has struck.

Think about a phone battery. When it's new, you charge it overnight, and it lasts all day. But after thousands of charge cycles, it drains faster and faster. Eventually, it barely holds power at all. Our brains work the same way. Every notification, message, and screen switch drains our cognitive battery.

Take Andi, a product manager I worked with at a software company. She told me, "I feel like my brain is a phone battery that just doesn't hold a charge anymore. I used to be sharp all day. Now by 2 p.m., I'm staring at my screen, unable to focus on even simple tasks." Andi wasn't weak or lazy. She was experiencing what neuroscientists call cognitive depletion. Every time you switch your attention, blood rushes to your prefrontal cortex. Your brain sounds a two-tone alarm, searches for the right neurons to handle the new task, then activates them. This process, called rule activation, happens in tenths of a second. Each switch burns precious metabolic energy.

Your brain accounts for only 2% of your body weight, but consumes approximately 20% of your daily calories. Most of that energy is devoted to keeping your body running: breathing, heartbeat, and temperature regulation. Only a tiny reserve is left for active thinking. As neurologist Richard Cytowic told me, "The brain's reserve margins are slim and quickly eaten up by constantly shifting attention." I tracked 20 teams at three Fortune 500 companies and found that the average knowledge worker toggles between apps and websites 1,200 times per day.
That's 1,200 energy-draining attention switches. If each switch takes just two seconds, that's 40 minutes a day just transitioning between tools. But the real cost isn't time; it's the cumulative exhaustion from all that switching.

Since ancient Greece, writers have described exhaustion as the depletion of a finite resource. Today, that resource is attention. Just like muscles grow sore from physical labor, minds weaken when overtaxed by constant switching, scrolling, and interpreting. The problem is that our bodies are great at signaling physical fatigue (sore muscles, aching joints), but our brains rarely wave a red flag. Instead, exhaustion sneaks up on us. We don't realize we've crossed the line until we're already depleted.

Stop blaming yourself. If you feel exhausted by digital tools, you're not failing. It's the inevitable physics of finite energy colliding with infinite digital demands.

2. Tools multiply faster than our capacity

We often assume the solution to digital overload is better tools. If I just had the right app, we think, everything would be easier. But the more tools we adopt, the more fragmented our lives become.

Consider HealthCo, a global company I studied. When I asked employees to list their digital tools, I found that they weren't just using email anymore. They had Slack for chat, Zoom for meetings, Jira for project management, Salesforce for customers, SharePoint for documents, Teams for cross-department updates, plus dozens of specialized tools. Each promised to make work easier. Together? They created chaos.

One employee, Marcus, told me, "I spend more time figuring out where to respond than actually responding. Did that question come through email? Was it on Slack? Did someone tag me in Jira? I'm constantly hunting for conversations across platforms."

My research revealed that the average knowledge worker spends 57 minutes per day switching between applications. They also spend 30 minutes daily deciding which tool to use for each task. Should I message on Slack or email? Should I put this in Notion or Google Docs? These micro-decisions seem trivial, but when you're making them 1,200 times a day, they create what psychologists call decision fatigue.

Let me tell you about Shireen, a marketing specialist who listed 36 different digital tools she used daily, everything from Adobe Illustrator to TikTok to her banking app. After our conversation, she looked at her list and said, "Oh God. That exhausts me just looking at it."

Shireen decided to try an experiment: she cut her tools in half. She identified tools that were redundant (using both Alexa and Siri), tools she barely used (Evernote for notes when she had Word), and tools that weren't essential (three different social media apps for posting the same content). Six months later, her exhaustion score dropped by 40%. She told me, "I thought I'd miss those tools, but instead I feel liberated. It's like I decluttered my digital life."

Technology expands faster than our cognitive capacity to manage it. We're running software from 2024 on hardware (our brains) that hasn't been upgraded in 300,000 years. Exhaustion isn't built from volume alone, but from fragmentation across too many platforms.

3. Interpretation is more draining than information

It's not the flood of emails and notifications themselves that exhaust us most. It's what I call inference fatigue, meaning the mental work of constantly interpreting ambiguous digital communication.
Take Aaliyah, who runs a nonprofit in Georgia. Her inbox was overwhelming, but what really wore her down was the constant interpretation. She told me, "I spend so much mental energy trying to decode messages. Why did my colleague only reply with 'ok'? Are they annoyed? Why hasn't the donor responded? Did I offend them? What does that emoji really mean? It's exhausting trying to read between the lines all day long."

Digital communication strips away crucial context: tone of voice, body language, facial expressions. Our brains work overtime to fill in the gaps. This creates what neuroscientists Susan Fiske and Shelley Taylor call controlled processing, or slow, deliberate, effortful thinking that demands huge amounts of cognitive resources.

Researchers examined college students responding to ambiguous Facebook posts. Not only did students feel mentally exhausted trying to figure out if someone was genuinely depressed or just seeking attention, but the emotional toll of arriving at these conclusions left them feeling depleted and dispirited.

Think about your own digital life. How often do you reread a text, worrying about its tone? How many times have you analyzed a brief email response, wondering if the sender is upset? These micro-interpretations pile up until we're mentally drained.

In the analog world, a face-to-face conversation offers dozens of cues that help us interpret quickly and accurately. A slight smile, a furrowed brow, and the speed of response all provide instant context. In the digital world, every interaction becomes a puzzle to solve.

But it gets even more complex. We're not just interpreting others; we're also constantly evaluating ourselves through digital mirrors. Take Zoom fatigue. Jeremy Bailenson at Stanford found that seeing ourselves on video calls creates unprecedented self-scrutiny. As he put it, "Imagine in the physical workplace, for the entirety of an eight-hour workday, an assistant followed you around with a handheld mirror." That's essentially what happens on Zoom.

One of my students, Xiao, confessed to me, "When you're teaching on Zoom, I'm constantly managing how I appear. Am I nodding enough to show I'm engaged? Can you see I'm taking notes? I was so busy thinking about how I looked to you that I missed half of what you were saying."

Exhaustion isn't just about too much input. It's about the invisible, constant work of decoding others' meanings and managing our own digital presence.

4. Boundaries create energy, not limits

We tend to think of boundaries as walls that keep us from being responsive and connected. But my research shows the opposite: boundaries create energy.

Consider Vicente, a high school teacher and soccer coach I studied. He made one simple change: he turned off all notifications during soccer practice. No emails, no texts, no pings for two hours each afternoon. "At first, I was anxious," Vicente told me. "What if I missed something important? But then something magical happened. I was fully present with my players. I could see their technique improving, and have real conversations with them. And when I checked my phone after practice, I had more energy to deal with messages because I wasn't depleted from constant interruptions."

Boundaries are like fences around your attention. They don't shrink your world; they protect the space where energy regenerates.
In one study I conducted with companies using internal social media platforms like Jive, employees who set specific times to check messages improved their ability to find expertise by 31% and identify key contacts by 88%. Why? Because when they weren't constantly interrupted, they could pay attention to patterns in communication and learn who knew what in their organization.

But many of us resist boundaries. We worry we'll miss something critical or appear unresponsive. Jed, an investment banker I interviewed, prided himself on responding to every message within minutes. "I'm dependable," he told me. "People know they can count on me for fast responses."

So, I interviewed his colleagues. Most didn't even notice his quick responses. His boss actually wished he'd slow down: "Sometimes Jed responds too quickly without thinking things through. I'd prefer more thoughtful responses, even if they take longer." Jed was exhausting himself to maintain a reputation that didn't exist. His always-on approach scored him a five out of six on the exhaustion scale, while providing no real benefit to his colleagues.

Boundaries are about saying yes to energy, focus, and genuine connection. When we protect our attention, we create energy.

5. AI could save us . . . or exhaust us completely

Let me present two scenarios based on my current research with ten companies that are implementing AI.

AI as liberator. Imagine an AI assistant that truly understands your work patterns. It filters your inbox, showing you only the three decisions that need your input today. It summarizes those lengthy Slack threads into two sentences of relevant information. It drafts routine responses that sound like you, leaving you free to focus on creative, meaningful work. I'm seeing glimpses of this at companies like HealthCo, where junior lawyers use AI to review contracts. Tasks that took days now take hours. They use their freed-up time to do higher-level work: strategic thinking, relationship building, creative problem-solving.

AI as amplifier of exhaustion. My students recently used ChatGPT to write thank-you notes to a guest speaker. The AI expanded their brief thoughts into lengthy, formulaic essays. The speaker's response? "If you'd sent me all these AI-expanded notes, I would've just fed them back into ChatGPT to summarize them." Think about the absurdity: We're using AI to expand our content, forcing recipients to use AI to compress it back down. It's an exhaustion amplification loop.

I'm already seeing this at scale. One executive told me 70% of internal documents at his company are now AI-generated: longer, more verbose, but not more valuable. As he put it, "We're drowning in synthetic content that no human actually wants to read."

The scariest part? AI is getting better at keeping us hooked. In one study, personalized messages crafted by ChatGPT were remarkably effective at persuading people. AI can recognize our emotional states better than many humans, and it never gets tired of talking to us. A company I consulted for developed an AI sales assistant specifically designed to keep customers on the phone longer. As their product chief told me, "Our AI figures out the perfect next thing to say to maintain engagement. It's like a conversation that never wants to end."

If we design and use AI thoughtfully, it can reduce exhaustion by handling routine tasks and freeing us for meaningful work. But if we let it run wild, it will accelerate our exhaustion to unprecedented levels.
AI is either the beginning of the end of digital exhaustion, or exhaustion's final evolution. The choice is ours.

Enjoy our full library of Book Bites, read by the authors, in the Next Big Idea App. This article originally appeared in Next Big Idea Club magazine and is reprinted with permission.



 

2025-10-30 08:00:00| Fast Company

Some 99% of hiring managers in the U.S. say they've used AI in some form during the hiring process, a 2025 report reveals. AI can whiz in and speed up cumbersome workflows (or make them disappear altogether). But after Fast Company spoke to several hiring managers and chief human resources officers to understand how HR is using AI to hire today, it became clear that for every benefit AI offers, there's a human cost. In this piece, paid subscribers will get a step-by-step guide outlining how AI is reshaping hiring (and who gets jobs), learn what HR is doing to ensure hiring remains as fair as possible across the workforce, and find out what job seekers can do to maximize their chances of landing the position.

1. Writing job descriptions, scheduling interviews

Most jobs come with repetitive tasks that can be easily automated, and hiring is no different. According to the 2025 AI in Hiring report from international staffing firm Insight Global, which surveyed more than 900 workers in the professional services industries, 75% of hiring managers use AI to schedule interviews, 54% use it to write job postings, and 53% use it to take notes during the interview and to draft emails to candidates. At Zillow, using AI to book interviews reduced the time it takes to schedule an interview with a candidate from 19 hours to 30 minutes, a 97% reduction, the company says. Meanwhile, documenting interviews with AI saved the team 33 hours per quarter.

Bosses maintain that these are all good things. "Our team was freed up to do more strategic thinking," says Roz Harris, VP of talent acquisition, engagement, and belonging at Zillow. "AI removes the administrative no-joy work," says Mary Alice Vuicic, chief people officer at Thomson Reuters, where AI is used to help write job descriptions, screen résumés, schedule interviews, and transcribe meetings.

Companies are also building AI chatbots to help answer candidates' questions as they navigate the hiring process. At Genpact, an IT consulting firm, the Genpact Engage chatbot has handled a million-plus questions across 30 countries to date. It has also nudged candidates to finish their applications, improving completion rates by nearly 50%.

However, this has come with a cost. Harris points out that as her team of 15 built these tools at Zillow, they knew they were making their own work obsolete. "There's no hiding that the tool we're asking you to design could replace you," she says. Zillow worked with team members to re-skill them and ensure they could still work at the company, despite AI taking over a lot of their duties, Harris adds. But in a world where workers may be replaced by AI, an HR team designing itself out of jobs speaks to the reality we live in.

2. Screening résumés

The vast majority of companies today are also using AI to screen résumés. According to a report from Harvard Business School, more than 90% of companies are using an automated system to filter or rank middle- and high-skilled job candidates. And yet? The same report mentioned that 88% of companies say qualified high-skill applicants are filtered out because they don't match the criteria in the job description. HR management company Workday uses AI to screen candidates for jobs. It's currently facing a collective-action lawsuit from applicants who allege the algorithms discriminated against them because of age, race, and disability.
Many of the experts Fast Company spoke to pointed out that human resources departments are deluged with job applications, in part because AI has made it easier than ever to apply for jobs and spam hiring managers: DealBook reports that applications with AI-generated résumés submitted on LinkedIn have increased 45% this year. Overwhelmed HR departments have little choice but to fight fire with fire.

"Candidates are using AI and AI agents to apply for thousands of jobs, so the candidate funnel has exponentially increased," says Ali Bebo, chief human resources officer (CHRO) of educational assets company Pearson.

Cognizant, an IT services provider, is developing an AI screening system that will score and rank résumés, with the aim of rolling it out in December. The AI will fast-track qualified candidates, allowing them to move through validation steps and get to an interview with a hiring manager more quickly. Kathy Diaz, Cognizant's chief people officer, is careful to point out that the company is running a number of tests and quality checks on the AI. "We want to ensure we're not missing candidates that we shouldn't and we're selecting the right ones. We don't want to waste anyone's time," she says. The hope, she says, is to eventually save enough time to be able to offer coaching and feedback to candidates who don't make the cut.

Still, while several hiring managers say they use AI to screen résumés and provide a recommendation, the final decision, they contend, is up to humans. Absorb Software, an AI-powered learning platform provider, uses AI to rank candidates' résumés. But humans review each application to make sure the AI didn't miss anything; highly ranked résumés get a couple of minutes of human review, while lower-ranked résumés get about 30 seconds.

Not all companies are on board with automating the process. Zillow and Pearson refrain from using AI to screen résumés. "We're not using AI to screen candidates, per se, because there are some challenges that are out there, some issues that have been out there where folks have been sued," says Bebo of Pearson. "We're quite cautious with using AI to do the screening for us."

Résumé screening rewards candidates whose experience matches the job description, but it can miss candidates with unconventional backgrounds who might be potential stars. For example, an econ doctorate with a popular blog might make a great personal finance columnist for a media company, but an AI might skip over that person if they have no newsroom experience. To avoid falling through the cracks, candidates should provide detailed résumés, recommends Cheryl Yuran, CHRO at Absorb Software. While a one-page résumé has been standard for years, she says candidates can go up to two pages and should also include a cover letter. "We went through a phase where the shorter the better, because people were reading every résumé," she says. "Now with AI reading it, you want to include a decent amount of description to show your full experience, so AI has enough data."

3. Interviewing

So far, companies that use AI to conduct interviews are in the minority, but perhaps not for long. According to a 2024 Resume Builder study, about a quarter of companies use AI to conduct interviews, and another 19% plan to do so within the next year. Brian Jabarian, a researcher at the University of Chicago, conducted an experiment analyzing the results for 70,000 candidates who interviewed for a customer service position.
Candidates were interviewed by a human, interviewed by AI, or given a choice between the two. Overall, Jabarian found that the AI interviews offered a more consistent experience: AI interviewers had a 50% chance of covering 10 of 14 required topics, compared to 25% for human interviewers. He also found that candidates interviewed by AI were 12% more likely to get a job offer.

PSG Global Solutions, the company that built the AI bot that Jabarian used in his experiment, is planning to roll out its AI interviewer in 80 countries with more than 5 million candidates in early 2026. "We wanted to do this earlier, but this represents a major process change and has a significant impact on the candidate experience," says David Koch, PSG's chief transformation and innovation officer. "We wanted to be sure it worked as intended and to fully understand its effects, so we began with a pilot and asked Dr. Jabarian to conduct an independent large-scale field study to evaluate its impact."

Despite the encouraging outcomes, Koch still recommends AI interviewing only for specific cases: high-volume, high-turnover jobs for which applications are pouring in. "AI interviews are not a good fit when you have to sell a candidate on the job, or you need specific talents and fit, such as a senior leadership position," he says. (As someone who asked to try an interview with PSG's AI, I found that Koch's statement struck a deep chord: the AI was competent but soulless.)

Currently, Cognizant uses AI to make screening calls that check a candidate's availability and interest. AI can handle 300 screening calls in one hour, compared to the four or five calls a recruiter can get through, while producing the same success rate as a human recruiter.

While some companies are using AI to interview, the vast majority of those Fast Company reached out to drew a hard line at using AI for interviewing. Safe Software, a Canadian company that helps organizations manage their data, notes that it's critical to keep a human in the loop for interviews. "We recently brought a new employee on board with multiple offers. She [said] the fact we did not utilize any AI interviewers was a big reason [why] she chose Safe," Bonnie Alexander, the company's chief human resources officer, writes in an email. "Our human-centric approach to the hiring process spoke to her values, which is exactly the feeling we want to maintain as we continue to expand."

Job seekers are not a monolith, though. Per Jabarian's study, 78% of job seekers selected AI when given a choice between a human or an AI interview. Jabarian believes this is because AI interviewing allowed candidates to schedule interviews at their leisure.

4. Onboarding

According to the Insight Global report, 50% of hiring managers are using AI to handle onboarding. Onboarding is ripe for disruption: In a 2025 study, 48% of more than 1,000 employees surveyed said a bad onboarding experience made them want to leave their job within six months, and only 28% of employees said their onboarding process prepared them for their job.

At Atlassian, HR professionals built Nora, an AI agent that onboards new employees, which was rolled out in February of this year. When a new employee signs on, Nora shares information relevant to their role as well as the specific tasks they need to complete. Within the first month, Nora completed 2,000 hours of work answering questions from new hires and is now one of the most highly used agents at Atlassian.
Going forward, Atlassian is working toward using AI to make the onboarding journey a one-click process. New hires will be able to start by clicking on Atlassian's AI-onboarding hub, which will automatically assign new tasks and training on Atlassian's tools and products.

At Cognizant, AI handles onboarding for new hires, ensuring that forms are filled out properly and fielding routine questions. One of the priorities Diaz has for the time AI saves is to reinvest it in improving the new-employee experience. "We aren't going to spend less time recruiting; we're going to spend different time, more time speaking with candidates about our culture and matching them up with buddies and mentors to make sure they have a good experience," she says. She points out that ideally onboarding should be a long experience where new employees get 30-, 60-, 90-day, and one-year check-ins, and HR can ask how things are going and if there are any gaps. "We're not just automating what we're doing," she says. "We want to totally reimagine it."



 

2025-10-30 08:00:00| Fast Company

In these volatile times, how do we navigate the intersection between values and commerce? Patagonia CEO Ryan Gellert and Chobani CEO Hamdi Ulukaya join New York Times reporter David Gelles onstage at the Masters of Scale Summit to reveal their different strategies for dealing with an activist White House, the pressure for what moderator Gelles calls "anticipatory compliance," and how they grow their businesses while also prioritizing causes like environmental conservation and immigration.

This is an abridged transcript of an interview from Rapid Response recorded live at the 2025 Masters of Scale Summit in San Francisco. From the team behind the Masters of Scale podcast, Rapid Response features candid conversations with today's top business leaders navigating real-time challenges. Subscribe to Rapid Response wherever you get your podcasts to ensure you never miss an episode.

Gelles: Ryan, Patagonia is a very different sort of company. This company has been meddling in politics, sometimes quite loudly, for more than 50 years now.

Gellert: Meddling is a strong word.

Gelles: Meddling . . . oh, that's a gentle word. [Patagonia founder] Yvon Chouinard has been donating to grassroots environmental activist campaigns for more than 50 years. In 2017, Patagonia sued President Trump and his cabinet during the first . . .

Gellert: Yeah, that's meddling. That's meddling.

Gelles: And you still, even at a moment when most CEOs are afraid to say anything about this administration, you're still out there raising the alarm almost every single week, it seems. When your business is selling clothes, why do you spend so much time talking about politics and policy?

Gellert: Well, first of all, it's super interesting being on the stage with you, Hamdi, and with you, David. I think you made reference to it. You wrote a book that's just come out in the last month about our founder and our 52-year history. You and I have gotten to know each other quite well over recent years, and I think there's a lot that we have spiritually in common as companies. And then I think, to the nature of your question, to each of us, there's a lot that's actually quite different in how we navigate that. I think for us to now answer your question, we are focused on protecting the natural world. Period. That's why we exist. It's not about making money. It's not about being the biggest player in outdoor apparel and equipment. It's about protecting the natural world. And so that's what we do, and we exist in a world right now, here in America, where the threats are absolutely unprecedented. And I think that what you might describe as speaking out, I just think is telling the [expletive] truth about what's going on in the world right now. Okay, so the climate's at risk, pollution and polluters, the regulations are coming off, and conservation, particularly public lands here in America. I mean, it is one goddamn threat after another, every single day. And so what are we speaking up on? Those things that matter. Same things we've spoken up on for 52 years.

Gelles: As the CEO of a company and as an individual, do you ever worry about the fact that this is a moment and an administration that has shown a willingness to be retributive?

Gellert: Yeah, of course.

Gelles: And how do you navigate that? Is there any, I mean, the word is anticipatory compliance? Are you holding back at all?

Gellert: I think we have to be very strategic. I think we have to be very considered.
I think what we talk a lot about is, where do we have authenticity to offer an opinion on something, and where can we be truly additive? If it's performative and we're just offering an opinion to offer an opinion, that's not a space we're going to play in right now. I don't think the times benefit from that. I think where we can be truly authentic is in one or two places. One is we're a business, and so we can speak from the business sector. And the other is on environmental and climate issues. We've got a 52-year history. We work our asses off to minimize our footprint. As you made reference to, we've supported grassroots activism for 40 years and counting. And so that's who we are, what we do, and I think we've earned the right to offer opinions on that.

Gelles: You get a sense of the different approaches to really a very similar and consequential set of issues. We're going to talk about more than politics, I promise, but I do want to come back to this issue of, Hamdi, how you navigate a moment like this, and when you decide to work with Ivanka Trump, even when you decide to work with the White House. It can seem like a no-win situation. You work with someone, you piss one side off. You say something, you piss the other side off. How do you think about engaging in these partnerships where you are trying to find common ground without alienating any of your consumer base?

Ulukaya: Yeah, look, what I do and what we do at Chobani is really, we have the known instinct or reflex that we react regardless of what the world thinks and all that kind of stuff. Ivanka actually did not start now. I worked with her in Idaho after President Biden got into the White House, actually. And what we did is, we made boxes of food from the farmers, and we delivered to people in need in communities at that time. And later on, even before the election, she and her partner, they created this organization called Planet Harvest, and she says: "Do you realize that in California, 40% of all the fruits and vegetables are wasted and left in the land because they don't look good and there's no buyer?" And I couldn't believe it. I am aware of these things, but I didn't even realize. And I went to the land and I saw, and partnered with her and her partners who had studied quite knowledgeably. Absolutely, we'll do it. Absolutely, we'll do it. I'll invest with it, and I'll lead it and improve the concept. So to me this . . . and when I said we are going to hire refugees during the first, I don't know how many years ago, we got death threats and boycotts, all kinds of stuff like that.

Gelles: The first time I wrote about your company.

Ulukaya: You wrote it in The New York Times, and we got death threats. We all have to react as human beings, who we are. And businesses are a combination of people. You've got to do the right thing regardless of what lawyers and communication experts will say.

Gelles: On the advisory council stuff right now.

Ulukaya: I want to invite everybody to . . . we have some serious, serious issues that we have to bring everybody to the table to. Look, we really do. I do see some egocentric reaction . . . just anger because of the other person. Okay, they are enormous about the differences. I don't know. I was invited to the White House because I'm announcing a huge investment in Idaho and another one in Rome in New York, and being part of Invest in America. I don't have any working relationship with the White House, but my view on immigration and refugees is the same. I have an organization called Tent. I just came from Mexico.
I’m meeting all those people who are encouraging people to hire refugees and train refugees. These are timeless truths. People are going to move, and we have to make a system that works for every single person. And we proved it in our factories, in our communities. And today, you will not have farmworkers, or you will not have functioning farms and agriculture, without immigration. Everybody knows that. Everybody.



 

2025-10-30 07:00:00| Fast Company

The latest buzzword is AI literacy. Much like social media, ESG, and CSR before it, employers are now looking for proof of fluency on résumés, and individuals are desperate to differentiate themselves to show that they are keeping pace. And it's everywhere: mentions of terms like agentic AI, AI workforce, digital labor, and AI agents during earnings calls increased by nearly 800% in the last year, according to AlphaSense data.

Over the last five years, workers across industries have come to be expected to be well-versed in a technology that is ever-evolving and still relatively new for so many, including the leaders implementing it. The trouble with AI is that by the time a candidate hits send on a CV, their level of proficiency is already outdated.

Shame is a quiet, corrosive force that's keeping people silent in the very moments when we need their voices most. But what if the real problem isn't the pace of change or people not understanding AI, but instead that we have made them feel ashamed of their lack of understanding, preventing people from raising their hand to say, "I don't know"?

Vulnerability makes us human. Mark Cuban recently posted on X that "the greatest weakness of AI is its inability to say 'I don't know.'" Our ability to admit what we don't know will always give humans an advantage. Why, then, are we creating an environment and fostering workplace cultures that encourage people to "fake it until you make it" when it comes to AI? The cost of staying quiet is real. We're at risk of shaming ourselves into obscurity.

The Shame Spiral in Action

Everyone's talking about the AI hype cycle. But almost no one is talking about the shame spiral it's creating. AI has a long-term impact not only on the economy but also on people's day-to-day lives. Companies are replacing roles faster than they're training workers, and in some cases, like Klarna, laying off workers only to hire them back when AI tools fall short. People miss out on jobs, not because they're unqualified, but because no one gave them a path forward. They walk around feeling like impostors in rooms they've already earned the right to be in. Inside companies, we see biased tools get approved and shortcuts turn into systems.

A recent report by LinkedIn shows 35% of professionals feel too nervous to talk about AI at work, and 33% feel embarrassed by how little they know. These aren't just workers; they're parents and community leaders. This shame spiral, fueled by hype that says "everyone gets AI except you," risks shutting down curiosity and critical questions before they even start.

The pattern signals a bigger issue: at the same time that people feel too ashamed to engage, AI systems are taking over and making decisions, incremental and important, that affect everyone. To avoid embarrassment, people take shortcuts. A recruiter might rely on an AI résumé screener without understanding how it works and which candidates it may be discarding. A manager might approve a tool that decides who gets extended care without asking what drives the algorithm. A parent might sign off on an AI-powered teaching tool without knowing who designed the curriculum. A 2024 Microsoft and LinkedIn survey found that only 39% of people globally who use AI at work have gotten AI training from their company.

We've seen what happens when these systems go unchecked. Amazon scrapped its AI recruiting tool after it was found to discriminate against women.
Workday faces a class-action lawsuit alleging its AI screening tools systematically exclude older workers and people with disabilities from job opportunities. Microsoft's chatbot Tay launched with the intention of learning from conversations, was exposed to trolls, and within 24 hours was posting racist, misogynistic, and offensive content. When silence replaces curiosity, people essentially remove themselves from the decision-making process until they are no longer accounted for.

Reshaping The Workplace Reality

AI is here, and it is changing the workforce. The choice is ours: bring people along with us and help them be part of the transformation, or leave them behind in the name of efficiency? What moves people from anxiety to agency isn't more lectures or tutorials. People are inspired by permission and tools. Permission to be a beginner. The freedom and the space to learn. The most confident AI users aren't experts; they play with different tools until they find what works for them. Digital dignity starts with that permission: permission to ask basic questions, to slow down, to admit gaps. It means leaders modeling vulnerability before demanding that employees fill theirs.

To truly embrace and harness the potential of AI, we must focus on impact, not mechanics. You don't need to code a neural net, but you do need to spot when AI systems are making decisions about you. Start with what affects you directly: parents can ask what tools schools are using, job seekers can learn how résumé screening works, and managers can ask what AI tools are coming into their workplace, and what training comes with them.

Practice saying "I don't know." The best leaders see gaps as opportunities to ask good questions. JPMorgan created low-stakes spaces for managers to experiment with AI, encouraging leaders to admit when they were stuck. That openness built trust and sped up adoption. Johnson & Johnson encouraged broad experimentation across business units, generating nearly 900 AI and generative AI use cases across research, supply chain, commercial, and internal support. The result? An internal chatbot for employees and a fresh approach to making clinical trials more representative.

This isn't just a knowledge gap. It's a culture of silence. And if we don't break it, AI won't be a tool for transformation; it'll be a mirror for all the systems we were too ashamed to question. The most powerful thing we can say in this moment is: "I don't know. But I want to learn." Because the future is still being written, and we all deserve a seat at the table and a hand on the pen.



 
