PERSPECTA

News from every angle

Results for “Big Tech companies”

18 stories found

There's a lot at stake for the tech giants betting big on wearables
Technology · WSJ · Business Insider · Times of India · 14d ago · 3 sources


This post originally appeared in the Business Insider Today newsletter.

AI's next target? Helping you kick your phone addiction. AI devices are a top priority for Big Tech companies, which view them as the future of how humans and AI interact, writes BI's Amanda Hoover.

You've likely heard of this hardware before, which acts as a sort of AI sidekick for your life. From the Rabbit R1 and Humane to Friend, the names are different, but the stories are the same: big expectations, difficult execution.

Amanda's story covers how it's not just upstarts looking to shake things up. Tech giants like Apple, Meta, and OpenAI are working on their own solutions. It's an uphill battle considering how addicted most of us are to our phones. However, the push for phone-free lifestyles, especially among Gen Z, does create an opening.

These tech giants also don't have much of a choice. Apple, for example, has largely sat out the AI wars, saving a ton of money on model development. That only works if the iPhone remains a key distribution channel for the AI it's skipping out on developing. Meta's business is also heavily reliant on smartphone usage. (How often do you check Instagram on your desktop computer? Do you even have a desktop computer?) If user behavior around phones changes in a meaningful way, you can bet Meta wants to be ahead of it.

AI devices also give companies a front-row seat to your life. You could argue that's already the case with AI chatbots, but the relationship between you and your chatbot of choice is still mostly transactional: you have a question, problem, or thought; the chatbot has an answer (hopefully). The relationship with an AI wearable is more fluid. It's always listening, learning, and collecting. The pitch is that this makes it a better copilot: understanding your habits means it can figure out the best way to serve you.
That's putting a lot of faith, and your personal data, into an AI device, though. Many executives I've spoken to have said this is the future. Truly leveraging AI is about incorporating it into your daily routine, not treating it as a one-off for specific problems. The irony is that this strategy has the potential to make AI even more addictive than the smartphones it's trying to replace. But maybe that's the point.

Google DeepMind CEO says the memory shortage is creating an AI 'choke point'
Technology · Business Insider · 12d ago


Google's AI boss Demis Hassabis said the memory market came down to "a few suppliers of a few key components."

- Google DeepMind CEO Demis Hassabis said that the "whole supply chain" for memory chips is constrained.
- "You need a lot of chips to be able to experiment on new ideas," Hassabis told CNBC.
- Google produces its own TPUs, but Hassabis said that there were still "key components" that were supply-constrained.

The memory shortage takes no prisoners. Even Google isn't immune.

AI companies are duking it out for greater and greater quantities of memory chips. The problem? The industry is heavily supply-constrained. Costs have skyrocketed, products have been tied up, and some companies, especially those in consumer electronics, are increasing prices.

On the AI front, Google DeepMind CEO Demis Hassabis told CNBC that physical challenges were "constraining a lot of deployment." Google sees "so much more demand" for Gemini and its other models than it could serve, he said.

"Also, it does constrain a little bit the research," Hassabis said. "You need a lot of chips to be able to experiment on new ideas at a big enough scale that you can actually see if they're going to work."

Researchers want chips, whether they work at Google, Meta, OpenAI, or other Big Tech companies, and memory is a key component. Mark Zuckerberg said that AI researchers demanded two things beyond money: the fewest number of people reporting to them, and the most chips possible.

Hassabis said that wherever there was a capacity constraint, there was a "choke point."

"The whole supply chain is kind of strained," Hassabis said. "We're lucky, because we have our own TPUs, so we have our own chip designs." Google has long built TPUs (Tensor Processing Units) for internal use. The company also leases them to external customers through its cloud, which has also put Nvidia on edge.
But even access to its own TPUs won't save Google from having to navigate the highly competitive memory market. "It still, in the end, actually comes down to a few suppliers of a few key components," Hassabis said.

Three suppliers dominate memory chip production: Samsung, Micron, and SK Hynix. These companies are struggling to meet demand for chips from AI hyperscalers without dropping their longtime electronics customers. It doesn't help that AI companies mainly want a different type of memory chip than PC manufacturers do: large language model producers want high-bandwidth memory (HBM) chips.

Don't expect Google's spending on AI infrastructure and chips to go down anytime soon. On its fourth-quarter earnings call, the company projected capital expenditures of $175 billion to $185 billion for 2026.