As many users probably saw this week, macOS 12 Monterey (do we really want yearly OS updates... still?) was revealed to have "Several macOS Monterey Features Unavailable on Intel-Based Macs." John Gruber of Daring Fireball cleverly observed that these features all make use of the imaging pipeline or the Neural Engine.

The chatter in the Mac groups is nail-in-the-coffin talk, or debates over whether the features really require an M1 (spoiler: they don't, they really don't). What is truly revealing, however, is how Apple will approach future iterations of macOS the way it does iOS. Until now, this has (mostly) been limited to iOS camera features that are one part lens but many, many parts software. Could the iPhone XS have had Night Mode? My guess is most likely, but Apple will not spend the energy back-porting the feature to older hardware configurations. Other times, the features are intrinsically tied to hardware, like the iPhone 12's image signal processor, used for Dolby Vision, Smart HDR 3, and temporal noise reduction.

What seems likely is that on macOS we will see older Macs left out of newer features, and not just support for the latest Wi-Fi protocol or faster storage, as has always happened, but literal software features, be it improvements to real-time color correction or, say, a hypothetical ML library that quickly identifies and highlights elements in a photo.
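To make that last example concrete, here is a minimal sketch of what such a feature could look like today using Apple's existing Vision framework, which already ships an attention-based saliency request. The image path and the idea of turning the returned bounding boxes into highlight overlays are my own assumptions for illustration, not anything Apple has announced for Monterey.

```swift
import Foundation
import Vision

// Hypothetical sketch: find the "interesting" regions of a photo using the
// Vision framework's attention-based saliency model, exactly the kind of ML
// work that runs far faster on the Neural Engine than on an Intel CPU/GPU.
let imageURL = URL(fileURLWithPath: "/path/to/photo.jpg") // placeholder path

let request = VNGenerateAttentionBasedSaliencyImageRequest()
let handler = VNImageRequestHandler(url: imageURL, options: [:])

do {
    try handler.perform([request])
    if let observation = request.results?.first as? VNSaliencyImageObservation {
        // Each salient object comes back as a normalized bounding box (0...1),
        // which a photo app could use to draw highlights over the image.
        for object in observation.salientObjects ?? [] {
            print("Salient region:", object.boundingBox)
        }
    }
} catch {
    print("Saliency request failed:", error)
}
```

This runs on Intel Macs today; the question is whether Apple keeps shipping features built on this kind of pipeline to hardware without a Neural Engine.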