Benchmark scores for the Apple M1 Ultra match — and in some cases, exceed — the top-spec x86 CPUs from Intel and AMD, but does the difference between RISC and CISC processors still exist? Apple announced the M1 Ultra on March 8 as its new top-of-the-line desktop SoC. The chip powers the company's all-new Mac Studio that comes with the form factor of a Mac mini but the power and versatility of the Mac Pro.
With the Apple M1 Ultra launch, many say the difference between RISC and CISC has become irrelevant. However, that may not be an exact representation of today's microprocessor scene. So what exactly is the difference between modern RISC and CISC CPUs in an era when the performance gulf between ARM processors and x86-64 offerings from Intel and AMD is getting increasingly blurred?
Apple's M1 series of chips is based on the ARM architecture, a RISC (Reduced Instruction Set Computer) design. Over the past few decades, CISC (Complex Instruction Set Computer) processors, which use a larger set of complex machine-language instructions, have traded blows with RISC chips, which use a smaller set of simpler instructions. While CISC has dominated the desktop in recent years, the M1 Ultra showed just how much the gap between the two approaches has narrowed. Still, despite fast and efficient Apple silicon and continued gains for ARM in the data center market, x86 is far from doomed and will remain relevant for the foreseeable future.
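The core distinction can be sketched in a toy interpreter. This is purely illustrative (the instruction names and memory model here are invented, not real x86 or ARM opcodes): a CISC-style machine can increment a value in memory with one read-modify-write instruction, while a load/store RISC-style machine does the same work with three simpler instructions.

```python
# Toy illustration of CISC vs. RISC instruction granularity.
# These are hypothetical opcodes, not real x86 or ARM instructions.
memory = {0x10: 5}

def run(program):
    regs = {}
    for op, *args in program:
        if op == "ADD_MEM_IMM":      # CISC-style: read-modify-write memory in one instruction
            addr, imm = args
            memory[addr] += imm
        elif op == "LOAD":           # RISC-style: only LOAD/STORE touch memory
            reg, addr = args
            regs[reg] = memory[addr]
        elif op == "ADD_IMM":        # RISC-style: arithmetic works on registers only
            reg, imm = args
            regs[reg] += imm
        elif op == "STORE":
            reg, addr = args
            memory[addr] = regs[reg]

# CISC-style: one complex instruction adds 3 to memory directly.
run([("ADD_MEM_IMM", 0x10, 3)])

# RISC-style: three simple instructions achieve the same effect.
run([("LOAD", "r1", 0x10), ("ADD_IMM", "r1", 3), ("STORE", "r1", 0x10)])

print(memory[0x10])  # 5 + 3 + 3 = 11
```

The trade-off this sketch hints at: complex instructions pack more work per instruction fetch, while simple fixed-format instructions are easier to decode and pipeline, which is part of why modern designs mix ideas from both camps.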
It is important to note that 'x86 vs. ARM' doesn't necessarily translate to 'CISC vs. RISC' circa 2022. While that distinction might have been clear-cut a few decades ago, the terms have become more ambiguous over the years, with both ISAs borrowing from each other: modern x86 chips internally break complex instructions into simpler, RISC-like micro-operations, while ARM has gained increasingly complex instructions and extensions of its own.