The RAM crisis is Apple's best chance in decades to capture the PC market

In the current RAM crisis, no company is better positioned than Apple to not only weather the storm but turn it to its advantage. It proved that when it released the MacBook Neo in early March. Despite including just 8GB of RAM, the Neo doesn't feel compromised, a testament to the company's silicon and software engineering. It may be tempting for Apple to treat its latest MacBook as a one-off. That would be a mistake, because at this moment, the business decisions that made the Neo possible represent a once-in-a-generation opportunity to become a bigger player in the PC market.

If you read Engadget, there's a good chance you know the contours of the global memory shortage, but it's worth repeating just how bad things have become in recent months. Just three companies — SK Hynix, Samsung and Micron — produce more than 90 percent of the world's memory chips. At the end of last year, Micron announced it would end its consumer-facing business to focus on providing RAM and other components to AI customers.  

Citing data from TrendForce, The Wall Street Journal reported in January that data centers would consume 70 percent of the high-end memory produced in 2026. As the Big Three shift more of their production to meet enterprise demand, they're allocating fewer wafers for consumer products, leading to dramatic price increases in that segment. According to data from Counterpoint Research, the price of memory — including consumer RAM kits and SSDs, as well as LPDDR5X memory for smartphones — increased by 50 percent during the final quarter of 2025. The firm predicts prices will rise by another 40 to 50 percent before the end of the current quarter, and the CEO of SK Hynix recently warned shortages could last until 2030.

Since nearly all consumer electronics need some amount of RAM and storage, the trickle-down effects have come fast and hard. In December, before the situation got as bad as it is now, TrendForce warned that most of the major PC manufacturers were either considering, if not already planning, price hikes. This month, the firm warned laptop prices could increase by as much as 40 percent if manufacturers and retailers moved to protect their margins. Such a scenario would send the cost of a $900 model to about $1,260.
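The worst-case math above is a flat percentage passed through to buyers; a quick sketch (the $900 base price and 40 percent hike are the article's example, the helper function is ours):

```python
# Worst-case laptop price hike if OEMs and retailers pass RAM costs straight to consumers.
def hiked_price(base: float, increase_pct: float) -> float:
    """Price after a flat percentage increase, rounded to cents."""
    return round(base * (1 + increase_pct / 100), 2)

print(hiked_price(900, 40))  # the article's example: 1260.0
```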

Amid all that, Apple added another point of pressure: the $600 MacBook Neo. During a recent investor call, Nick Wu, the chief financial officer of ASUS, described the Neo as "a shock to the entire market," adding "all PC vendors, including upstream vendors like Microsoft, Intel and AMD" are taking the cute device "very seriously." Wu warned ASUS would "need more time" before it could ready a response.         

For ASUS and other Windows manufacturers, any response realistically may take a year or more to formulate. That's because the Neo represents both a technical and logistical hurdle. 

To start, it's a fundamentally different machine from the one most Windows OEMs are making right now. It has the advantage of using "unified memory" instead of a set of traditional RAM modules. The 8GB of RAM the Neo has is shared between the A18 Pro's CPU and GPU, meaning it can more efficiently use the RAM that it does have. That's part of the reason the Neo doesn't feel like a Windows PC with 8GB of RAM. Apple didn't get to the A18 Pro and the MacBook Neo by accident. It has spent more than a decade designing its own chips. 

Since 2024, Microsoft has mandated 16GB of RAM — and 256GB of solid-state storage — for PCs that are part of its Copilot+ AI program. That branding effort may not have amounted to much, with Copilot+ AI PCs accounting for just 1.9 percent of all computers sold in the first quarter of 2025, but it did push OEMs, including ASUS, Dell and others, to make more capable machines. It also saw Microsoft rework Windows to better support ARM-based processors from Qualcomm. Still, it's hard to see how Windows manufacturers can challenge Apple by going back to existing or older x86 chips with less RAM.

Qualcomm's Snapdragon X2 processors could offer a potential response, but there are question marks there too. At CES 2026, the company announced the Snapdragon X2 Plus, a pared-down version of its X2 Elite chipset with a six-core CPU. On paper, it should offer similar performance to the A18 Pro, but it doesn't seem Qualcomm has produced the chip at scale, or that Windows OEMs have shown much interest in it. As of the writing of this story, the company's website lists just four X2 Plus-equipped models. I was only able to find one of those in stock, the $1,050 HP Omnibook 5. It has an OLED screen and more RAM than the Neo. Could HP repurpose something like the Omnibook 5 to take on the Neo? Maybe, but I'm not sure there's any way around the 16GB that Windows 11 needs to run decently.

Even if the Snapdragon X2 Plus offers a stopgap measure, no company operates a supply chain quite like Apple. It has spent billions of dollars to make itself independent of companies like Qualcomm by designing its own Wi-Fi and Bluetooth chips, for example. It also doesn't need to pay Microsoft a licensing fee to use a bloated Windows 11. Windows OEMs like ASUS and Lenovo have to absorb all of those costs, which is part of why they operate on razor-thin margins.

Per Statista, Apple earned a 36.8 percent gross margin on its products in 2025. That's almost exactly half the gross margin it made on services, which grew to a record 75.4 percent last year. For comparison, ASUS has seen its margins erode to about 15.3 percent in recent quarters, less than a third of Apple's company-wide 2025 average of 46.9 percent. For ASUS and other Windows OEMs, the short-term outlook isn't good. HP recently told investors that RAM now accounts for more than a third of the cost of its PCs. And if memory shortages continue, many of them will be forced to raise their prices to protect their margins.

Apple is in no such position. The iPhone recently had its best quarter ever, contributing $85.27 billion to the company's Q1 revenue. The fact that Mac revenue declined from $8.9 billion to $8.3 billion year-over-year didn't make a dent in Apple's bottom line. For the companies that must now compete against the Neo, it's not a fair playing field. For Lenovo, Dell, HP and ASUS, PC sales are nearly the whole business. For Apple, they're a side hustle.

As the company prepares to kick off its 51st year, it should consider that it may never be in a better position to claw ahead in the market where it all started. In both the PC and smartphone segments, Apple's market share has always been a distant second (and sometimes third or fourth) to Windows and Android, in part because commoditization has consistently worked against the company. But when a single part now accounts for a third of the cost of a new PC, the regular rules don't apply.

It's not just that the company is better insulated than nearly every other player against runaway RAM costs, it's that it also has a technological edge and the profit margins to compete on price at the same time. In recent quarters, the company's share of the PC market has hovered around the 9 to 10 percent mark, meaning it's consistently been about the fourth largest manufacturer. 

For as long as the RAM shortage continues, Apple should seriously consider sacrificing some of its PC profits to become a bigger player. So far, the company has moved to protect the margins on its more expensive devices. For example, it increased the price of the latest MacBook Air and MacBook Pro by $100. The company doubled the amount of base storage to make up for the hike. 

Moving forward, it should do everything it can to maintain, and maybe even lower, the price of its computers to a point its competitors can't meet. If the Lenovos and HPs of the world can't compete on either price or performance, consumers will move to Macs. As Apple looks to the next 50 years, it may not get another opportunity like the one it has right now.

This article originally appeared on Engadget at https://www.engadget.com/computing/laptops/the-ram-crisis-is-apples-best-chance-in-decades-to-capture-the-pc-market-130000672.html?src=rss

50 years of Apple pushing tech forward, for better or worse

Over the last 50 years, Apple reimagined personal computers, catalyzed the era of the smartphone, enlarged an iPhone and called it the iPad, and carved out a strong position in wearable tech with its Watch series and AirPods. It also popularized software and services like the App Store, FaceTime, iCloud, iMessage and many more. For a lot of us, the first time we pinched to zoom on a photo was likely on an iPhone.

However, Apple giveth and Apple taketh away. Features change or get removed, and consumers have to move on to whatever's new. For better or worse, the weight of Apple's influence has led entire product categories to follow suit. Or, more typically, there's resistance, complaining and then… following suit. With the benefit of hindsight, most of these cases are examples of Apple seeing where technology was going and getting ahead of a transition that would have been inevitable. Often, these transitions have caused short-term pain for some, but time has proven Apple (mostly) correct about dropping older tech.

As Sir Arthur Quiller-Couch once said: murder your darlings. Here are some of the darlings we’ve lost over the years.

The death of the disk drive (1998)

This is a two-parter. The iMac G3 marked Steve Jobs' return. The colorful all-in-one Mac was a new start in many ways. In 1998, Apple ditched the standard ports and myriad cable types of personal computers, going all in on USB and a little-known thing called the internet. (In fact, that’s what the ‘i’ in iMac stands for.)

In doing so, it also ditched the 3.5-inch floppy disk drive — although it did have a read-only optical disc drive. Even with the sluggish internet and USB transfer speeds of the time, the convenience was plain to see, and it led to a decade of thumb drives with ever-increasing capacities. High-capacity alternatives to the floppy disk, like the Zip disk and even MiniDisc, attempted to bridge the gap, but never gained the widespread adoption of the floppy. Flash drives and, later, internet-based file storage quickly made them all obsolete anyway. Apple was just a little early with its dismissal.

Portable music players (2007)

Despite Apple’s iPod being the de facto music player at the time, it was supplanted by the company’s own biggest hit: the iPhone. At its peak, the iPod made Apple the zeitgeisty tech company it is today. It dominated the MP3 player market, and by 2006, iPods were responsible for 40 percent of the company’s revenue. And that was before the era of Apple including a free U2 album with every iTunes account.

When the iPhone launched in June 2007, it was swiftly followed by the iPod Touch in September. This was the iPhone without the phone part — indicating how the company saw the future of music listening. You didn’t need an iPod if you already had an iPhone in your pocket. It’s the best example of Apple cannibalizing a product that defined a decade with something far more impressive and, eventually, more successful.

It was a slow death. Ignoring the countless MP3-playing rivals (RIP, Zune), Apple dropped the classic iPod in 2014. It did the same to the tiny iPod nano and iPod shuffle in 2017. Finally, the company discontinued the iPod Touch in May 2022.

The physical smartphone keyboard (2007 plus change)

A BlackBerry on a rock.
Unsplash / Thai Nguyen

When the iPhone’s capacitive screen and touch keyboard landed, there was a learning curve. Moving from physical keys (whether it was a 9-key alphanumeric version or the BlackBerry’s QWERTY experience) to a touch screen, especially on the tiny 3.5-inch panel of the first iPhone, wasn’t easy.

But it was the future. Physical keyboards took up physical space on devices — especially as those screens grew and grew. The adoption of touch keyboards sped up, thanks to third-party keyboard apps on Android, like Swype, SwiftKey and many others, introducing different input methods, smarter predictive text, typing algorithms and even touch heatmaps. Software keyboards were intrinsically more versatile, supporting multiple languages, infinite key arrangements and eventually emoji galleries. A colon-ellipsis smiley soon didn’t hit the same.

The death of the disk drive, part 2 (2008)

The MacBook Air, introduced by Steve Jobs in 2008, was famously pulled from a manila envelope to demonstrate its ultraportable design. To achieve that slimness, it had to ditch the internal optical drive entirely, making it the first MacBook without one. That move kickstarted an era of ultraportable laptops.

It was a major break from what laptop users were used to, and Apple tried to offer people some options. Apple introduced "Remote Disc," a feature which allowed the Air to wirelessly use the optical drive of a nearby Mac or PC, and offered an external USB SuperDrive as an optional accessory. (I've used mine exactly once since I bought it in 2013.)

While it was considered underpowered compared to Windows competitors, the original MacBook Air set a new design standard for the industry. It positioned Apple’s Macs for a future of App Store software installations, faster internet connectivity, and the rise of streaming media, cloud storage, and the rest. Apple’s MacBook Pro and MacBooks eventually followed suit, ditching optical drives in 2012.

Adobe Flash (2010)

Thoughts on Flash
Apple

In the early days of the iPhone, Apple famously refused to support Adobe Flash. This was in the late 2000s, when much of the web was still built with Flash for animations and video support. The iPhone and iPad notably lacked support, creating a fractured browsing experience for years.

In April 2010, just as the first iPad arrived, Steve Jobs published his "Thoughts on Flash" open letter, criticizing its poor security and a lack of touch-friendliness. Many Flash games and interfaces interacted with the mouse cursor's precise position, something that was invisible on the touchscreen iPhone.

It was also a calculated move. By denying Adobe access to the rapidly growing iOS user base, Apple forced developers to choose between sticking with the aging Flash or embracing open standards like HTML5. Also, by making Flash-based games and tools incompatible, it nudged those developers (and iPhone users) toward the App Store for those very games and tools (and more). There, Apple could curate and monetize those creations.

It was a slow death: Adobe finally discontinued Flash in 2020.

The headphone jack (2016)

Leszek Kobusinski / Alamy

In a move described by Apple marketing executive Phil Schiller as "courage," nixing the headphone socket ended up becoming the biggest headline to come from the iPhone 7 launch in 2016. Every flagship iPhone since has lacked the jack, with the most recent iPhone to include it being the original iPhone SE.

To make the change more palatable, Apple bundled a Lightning-to-3.5mm adapter (expect more dongle chat later) with the iPhone 7, 8 and X. In-box headphones also swapped from the typical jack to Lightning. Naturally, this meant you couldn’t charge the phone while you listened to music, unless you already had a pair of wireless headphones.

Of course, this move was ultimately instrumental in making true wireless earbuds ubiquitous. While Apple wasn’t remotely the first company to introduce wireless earbuds (and then headphones), the removal of the headphone jack undoubtedly sped up adoption. Pour one out for the Bragi Dash, the Jabras, the Jaybirds of this world.

Conveniently, alongside the aforementioned iPhone 7, Apple announced the AirPods. Features like one-tap setup and automatic pairing brought the convenience people expected of Apple and put it into a tiny white case.

Despite early resistance and "bragging" from rivals who clung to the headphone jack, at this point the socket is mostly confined to cheaper smartphones or phones aimed at audiophiles (hi, Sony) or mobile gamers (ASUS ROG).

Eventually, the iPad Pro also lost its headphone jack, and the rest of the company's tablets followed. The only non-Mac device to keep the jack? The iPod Touch, which had one until its discontinuation in 2022.

Bespoke ports (2016)

MacBook Pro dongles
Engadget

2016 was the year of donglegate. Apple’s MacBook Pro redesign that year was another drastic shift in the laptop's history. Chasing ever-thinner profiles and less port fuss, Apple stripped away nearly every legacy connector that professionals relied on. This was particularly jarring after the previous-generation MacBook Pro (2015) was often cited as the peak of utility, with a MagSafe charging port, two Thunderbolt 2 ports, two USB-A ports, not to mention a full-size HDMI port and an SD card slot.

Those were replaced with four (or, on the cheapest 13-inch MBP, only two!) Thunderbolt 3 USB-C ports and a headphone jack. For power users (like some Engadget editors), it demanded dongles (possibly multiple ones) to connect USB-A thumb drives, wired internet, SD cards, external screens and, at that point, pretty much everything else. Many were particularly furious at the loss of the MagSafe charging connector. Of course, this also meant that one of those USB-C ports would be used primarily to charge the MBP. The change sped up the availability of USB-C peripherals and accessories — perhaps because everyone was sick of carrying around so many dongles and hubs — but we still have USB-A devices. HDMI is everywhere. I still have SD cards.

Eventually, Apple course-corrected. The 2021 MacBook Pro redesign reintroduced the SD card reader and HDMI port, and even MagSafe returned, freeing up a USB-C port.

This article originally appeared on Engadget at https://www.engadget.com/big-tech/50-years-of-apple-pushing-tech-forward-for-better-or-worse-170025862.html?src=rss

© Justin Sullivan via Getty Images

SAN FRANCISCO - SEPTEMBER 05: Apple CEO Steve Jobs speaks in front of a display of the new iPod products during an Apple Special event September 5, 2007 in San Francisco, California. Jobs announced a new generation of iPods as well as a partnership with Starbucks to access music being played at Starbucks coffee shops with the new iPod Touch. (Photo by Justin Sullivan/Getty Images)

Razer Blade 18 (2025) review: An 18-inch gaming laptop that does the most

As the 7-pound Razer Blade 18 sat on my desk, its all-black unibody case and enormous 18-inch screen towering before me like the monolith from 2001, I couldn't help but think, "Who the hell needs such a big-ass computer?" I'm sure they're out there — the gamers with deep pockets and little regard for portability, the video editors who demand as much screen space as possible. But on the whole, the market for the Blade 18 is pretty small, especially when Razer's Blade 14 and 16 strike a far better balance of price, performance and weight.

What the Razer Blade 18 promises, if you choose to accept its gargantuan proportions, is unbridled power and screen real estate. It’s running Intel’s new Core Ultra 9 275HX processor, a 24-core beast with a maximum speed of 5.4GHz. Its 18-inch screen can reach up to 240Hz at slightly over 4K (3,840 by 2,400 pixels) and 440Hz when downscaled to 1080p+ (1,920 by 1,200). And of course, you can equip it with NVIDIA’s fastest mobile GPU, the GeForce RTX 5090. Given everything under the hood, it’s honestly impressive it weighs just seven pounds, alongside a 2.1-pound power adapter. (In comparison, the similarly premium 18-inch Alienware Area 51 comes in at 9.5 pounds with a 2.2-pound power adapter.)
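The dual refresh-rate spec makes sense once you count pixels: the 1080p+ mode pushes a quarter of the pixels per frame, so even at 440Hz the display pipeline moves less data than the 240Hz native mode. A rough sketch (the helper is illustrative, not Razer's spec):

```python
# Pixel throughput the display pipeline must sustain in each of the Blade 18's two modes.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

native = pixels_per_second(3840, 2400, 240)      # native mode, slightly over 4K
downscaled = pixels_per_second(1920, 1200, 440)  # 1080p+ high-refresh mode
print(f"{native / 1e9:.2f} vs {downscaled / 1e9:.2f} billion pixels per second")
```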

When we last reviewed the Razer Blade 18 a few years ago, my colleague Sam Rutherford bristled at the laptop’s size, battery life and high cost (which could reach upwards of $5,000 when fully decked out). All of those points are absolutely fair, but this time around it’s easier to see what Razer is trying to do with the Blade 18: It’s simply doing the most for the people who demand it. And it’s doing so with the excellent build quality we’ve come to expect from Razer (albeit with a high $2,799 starting price).

If you’re still trying to wrap your head around why an 18-inch laptop even exists, the Razer Blade 18 isn’t for you. And honestly, the concept isn’t even that farfetched. Given the move towards thinner display bezels and other refinements, laptop makers have been able to squeeze larger screens into their typical case sizes. The Razer Blade 16 was a bit heavier than the Blade 15 when it launched, but now Razer has slimmed its case down considerably. The Blade 18 similarly serves as an upgrade to the old Blade 17 — and what an upgrade it is.

Razer Blade 18 from the side, viewing a few ports.
Devindra Hardawar for Engadget

My review unit, which was equipped with that new Intel chip, an RTX 5090, 64GB of RAM and a 4TB SSD, tackled Cyberpunk 2077 with all of its settings cranked without breaking a sweat. At its native resolution, which again is a bit higher than 4K, it reached 131 fps with 4X frame generation (which uses DLSS 4’s upscaling to interpolate additional frames). That’s roughly half as fast as the desktop RTX 5090 running in 4K with the same settings — but don’t forget, that GPU alone typically runs between $2,000 and $3,000 these days. Razer charges an additional $1,400 to upgrade the Blade 18 from an RTX 5070 Ti to the 5090. (And for the record, the total cost of our fully decked-out testing unit was $4,599.)

Beyond frame rates, Cyberpunk 2077 simply looked great on the Blade 18’s 240Hz IPS LED display. It’s not as bright as the MiniLED screens Razer offers on the Blade 16, and it doesn’t offer the insane contrast levels of an OLED screen, but it does the job well. For the price, though, it would have been nice to see more modern screen technology. Like the Blade 16, the 18 also offers a dual-mode display, which is how it reaches those higher 440Hz refresh rates in 1080p+. 

Razer Blade 18 rear case
Devindra Hardawar for Engadget

It worked as advertised in Overwatch 2, where I played several matches well above 300fps with high quality settings. The additional visible frames are particularly helpful during fast-paced moments, where you may have the blink of an eye to take out an opponent before they headshot you. 
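That "blink of an eye" framing can be made concrete: a blink lasts very roughly 100 to 400 milliseconds, while the per-frame time budget shrinks as frame rate climbs. A small sketch (the helper function is ours, not a benchmark tool):

```python
# Time budget per frame at various frame rates; a blink lasts roughly 100-400 ms,
# so a 300 fps match shows dozens of frames in the time it takes to blink.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 240, 300, 440):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms per frame")
```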

I had no doubt the Blade 18 would be fast, but I also noticed that it felt genuinely more immersive than the Blade 16 because of its more expansive display. As I leaned in during Cyberpunk 2077, Halo Infinite and Overwatch sessions, it almost felt like I was in front of a desktop setup. That’s ultimately what you’re paying for with this machine. When I opened up audio files in Audacity, I also noticed that the additional screen space simply made it easier to sift through my timelines.

PCMark 10 | 3DMark (TimeSpy Extreme) | Geekbench 6 CPU | Cinebench R23

Razer Blade 18 (Intel Core Ultra 9 275HX, NVIDIA RTX 5090): 7,703 | 12,228 | 2,733 / 19,340 | 1,104 / 33,150

Razer Blade 16 (2023, Intel i9-13950HX, NVIDIA RTX 4090): 7,364 | 8,667 | 2,713 / 16,245 | 2,024 / 15,620

Razer Blade 18 (2023, Intel i9-13950HX, NVIDIA RTX 4060): 7,326 | 5,009 | 2,708 / 12,874 | 1,900 / 15,442

When it comes to direct benchmarks, the Core Ultra 9 chip isn’t much better than Intel’s 13th-gen hardware in single-threaded tasks, and it’s sometimes bested by AMD’s latest batch of hardware. Intel has made significant progress in multi-threaded tests like Geekbench 6, though, and that sort of performance makes the Blade 18 ideal for tasks like video rendering and complex games.

The Blade 18 also ran remarkably cool: During a 3DMark stress test, which involved running one demo 20 times in a row, the CPU stayed at 70 degrees Celsius most of the time, with occasional spikes to 85C. During the CPU-heavy Cinebench tests, Intel’s chip jumped to 80C on average with some jumps to 90C. The GPU, meanwhile, held a steady 70C and never wavered during 3DMark benchmarks. The fans sure can get loud, though, as you’d expect for a system that’s relatively thin and needs to pump out a ton of heat.

Razer Blade 18 power, Ethernet, USB 2 and USB-C ports.

Razer has been building sturdy and attractive gaming laptops for well over a decade now, so it’s not a huge surprise that the Blade 18 feels incredibly solid and premium. Its keyboard has a great depth to it that feels just as good playing shooters as it does while typing, and its trackpad is wonderfully smooth and accurate. (It does get a bit overzealous when detecting multi-touch gestures, though.) Port-wise, the Blade 18 also packs in everything you’d want, including three USB Type-A 3.2 connections, one Thunderbolt 5 USB-C port, a Thunderbolt 4 USB-C port, 2.5Gb Ethernet and a full-sized SD card slot.

Personally, if I had to choose from Razer’s current lineup, I’d go with the Blade 16 so that I could actually carry it around and occasionally use it as a productivity machine. Not so with the Blade 18 — its short battery life of two hours and 17 minutes (in PCMark 10’s battery benchmark) means you’ll always need to lug around its beefy power adapter. After an hour of writing this review, its battery also dropped from fully charged to 38 percent. But really, nobody is buying this thing just to deal with spreadsheets and emails. You want ultimate power and an enormous screen? Then battery life will suffer.
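The writing-session anecdote implies a drain rate of 62 percentage points per hour; extrapolating linearly (a back-of-envelope sketch, not a benchmark, and real drain varies with workload) gives a runtime in the same ballpark as PCMark's measurement:

```python
# Back-of-envelope runtime from the review's anecdote: 100% to 38% over one hour of writing.
def hours_to_empty(pct_per_hour: float) -> float:
    """Hours until the battery is drained at a constant percentage-per-hour rate."""
    return 100.0 / pct_per_hour

drain = 100 - 38  # percentage points consumed in one hour of light work
print(round(hours_to_empty(drain), 2))  # about 1.61 hours at that pace
```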

A transparent window along the bottom of the Razer Blade 18
Devindra Hardawar for Engadget

To paraphrase The Lord of the Rings, one does not simply choose to live with an 18-inch gaming laptop — not without considering all of the conveniences you’re leaving behind. For the sickos who would dare tread that path, the Blade 18 is a solidly built powerhouse that weighs significantly less than rivals like the 18-inch Alienware Area 51. Just be prepared to pay Razer’s high price to own one.

This article originally appeared on Engadget at https://www.engadget.com/gaming/pc/razer-blade-18-2025-review-an-18-inch-gaming-laptop-that-does-the-most-153000136.html?src=rss

Intel gives us a glimpse of its Panther Lake Core Ultra chips

If Intel wants to come back as a chip fab, its upcoming Core Ultra series 3 laptop processors will be a crucial part of that. The company has just revealed more information about those processors (codenamed Panther Lake), which will use its 2-nanometer 18A process and be built in the US at its Arizona plant.

The Core Ultra series 3 system-on-chips will be utilized mainly in high-end laptops along with "gaming devices and edge solutions," Intel said. The company noted that they'd blend "Lunar Lake-level power efficiency with Arrow Lake-class performance," though it usually boasts that with all new Core chips.

They'll offer up to 50 percent more processing performance compared to previous generations, with some versions sporting as many as 16 performance cores (P-cores), along with efficiency cores (E-cores). Chip density will improve by 30 percent, while performance per watt will rise 15 percent. 

Intel's integrated Arc GPU also sees a 50 percent performance bump compared to the last generation, with a maximum of 12 cores in high-end versions. They'll also see an updated XPU design for AI acceleration with up to 180 Platform TOPS (trillions of operations per second).

Intel called its 18A architecture "the most advanced semiconductor node developed and manufactured in the United States," adding that it's "fully operational and set to reach high-volume production later this year." As recently as two months ago, however, the company was reportedly struggling with the yields it would need to even start production, let alone make a profit. 

It would be an understatement to say that Intel needs the new node to succeed. In August, President Donald Trump said that the company's new CEO, Lip-Bu Tan, should resign, before walking that back after a successful meeting between the two. Later, Trump announced that the US government was taking a 9.9 percent ($8.9 billion) stake in Intel, and last month NVIDIA said it was throwing the company a $5 billion lifeline to jointly develop PC and data center CPUs. In its July Q2 earnings report, Intel said it lost $2.9 billion and would lay off up to 20 percent of its workforce.

This article originally appeared on Engadget at https://www.engadget.com/computing/intel-gives-us-a-glimpse-of-its-panther-lake-core-ultra-chips-130010879.html?src=rss

© Reuters

Intel's logo is pictured during preparations at the CeBit computer fair, which will open its doors to the public on March 20, at the fairground in Hanover, Germany, March 19, 2017. REUTERS/Fabian Bimmer