Excellent, fair review and perspective. The storage section at the end is especially incisive:
Apple should not be selling 16 GB iPads. The starting tier for typical consumers should be 32 GB. There’s just not enough usable space on a 16 GB iOS device to do the things Apple has worked so hard to make easy to do. …
I also understand the product marketing angle. That there are a lot of people who will look at the 16 GB models, see that they can get four times the storage for just $100 more, and buy the 64 GB model instead — when they would’ve bought the base model if it were 32 GB. I get it. There’s no doubt in my mind it’s good short-term business sense to go with a 16/64/128 lineup instead of 32/64/128. But Apple is not a short-term business. They’re a long-term business, built on a relationship of trust with repeat customers. 16 GB iPads work against the foundation of Apple’s brand, which is that they only make good products.
Apple has long used three-tier pricing structures within individual product categories. They often used to label them “Good”, “Better”, and “Best”. Now, with these 16 GB entry-level devices, it’s more like “Are you sure?”, “Better”, and “Best”.
This relates to my argument in last week’s Accidental Tech Podcast that Apple’s high hardware margins are a strategy tax that hinders other important factors within the company.
A balance must be struck between healthy profit margins and making good products. I don’t think Apple has ever only made good products,1 but the introduction of “Are you sure?” models has only come to iPhones and iPads in recent years — the first few generations of iOS devices (and iPods) were all high-end when they came out.
And until this year, the “Are you sure?” iOS devices were just older models that got pushed down the product line over time with embarrassingly low storage sizes, like 8 GB.2 But iOS, cameras, and apps have progressed to the point that even 16 GB devices are now constrained, exacerbated by iOS’ poor user storage management. This is the first year that Apple’s flagship iOS devices are available in sizes that will often result in poor user experiences.
Not long ago, we could just tell our friends and family to buy “an iPhone” or “an iPad”, unqualified. They were all great.
Now, if our parents call us and say, “We just got an iPad!”, we need to think, Oh no, which one did they get? Please don’t be one of the shitty ones…
The cheapest Macs have usually shipped with too little RAM. No computer can be called good today if it ships with less than 8 GB of RAM, like the base models of the Mac Mini, MacBook Air, and non-Retina MacBook Pro — or if it only ships with a hard drive instead of an SSD or Fusion Drive, like the base models of the Mac Mini, 13” MacBook Pro, and non-Retina iMacs. ↩
The free-with-contract iPhones have all been 8 GB, including the 5c this year. ↩
Twitter Inc. has a proposition for app makers: Let’s start over.
Two years ago, Twitter irked developers with stricter rules around applications that plug into the social-media service. This week Twitter hopes to regain their trust and attract a broader set of app makers when it hosts its first developer conference in four years.
As a peace offering, Twitter on Wednesday is expected to announce a suite of tools that aim to make it easier for programmers to build apps, according to people familiar with the matter.
I’m not sure whether this “let’s start over” talking point is coming from Twitter PR or the WSJ, but it’s misguided.
Twitter’s API requires OAuth not only for its alleged security improvements, which are weak, but also to control and limit app developers. If any app could make API calls with HTTPS Basic Auth like the original Twitter API, Twitter would have no reliable way to identify which requests came from which app, so they wouldn’t be able to enforce their restrictions and branding requirements. Any API that requires apps to register with the service and identify themselves with each request is politically unreliable because the service will always have a much bigger stick to wield whenever it’s convenient.
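To make the mechanism concrete, here's a minimal sketch (in Python, and obviously not anything Twitter ships) of the OAuth 1.0a HMAC-SHA1 signing scheme Twitter's API requires. The key names and endpoint are placeholders; the point is that every signed request carries the app's `oauth_consumer_key`, which is exactly what lets the service identify, throttle, or cut off individual apps in a way that plain Basic Auth never could:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def sign_request(method, url, params, consumer_key, consumer_secret,
                 token_secret="", nonce="abc123", timestamp="1413000000"):
    """Sketch of OAuth 1.0a HMAC-SHA1 signing (RFC 5849).
    The oauth_consumer_key ties every request to a registered app."""
    oauth = {
        "oauth_consumer_key": consumer_key,   # identifies the app to the service
        "oauth_nonce": nonce,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": timestamp,
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Parameters are percent-encoded and sorted before signing.
    param_str = "&".join(
        f"{quote(k, safe='')}={quote(v, safe='')}"
        for k, v in sorted(all_params.items())
    )
    base = "&".join(quote(s, safe="") for s in (method.upper(), url, param_str))
    # The signing key combines the app's secret and the user token's secret.
    key = f"{quote(consumer_secret, safe='')}&{quote(token_secret, safe='')}"
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()
```

A real client would generate a random nonce and current timestamp per request; they're fixed here only so the sketch is deterministic.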
But that’s not the biggest problem — even an anonymous API is shaky ground because it can always change or disappear, like Twitter’s original API did. The problem is still the complete power over an increasingly important communication medium residing in a single company and its single centralized service.
Companies grow and change. Business needs change. Founders and leaders move on and get replaced.
Especially at Twitter. Twitter started out as a developer-friendly company, then they became a developer-hostile company, and now they’re trying to be a developer-friendly company again. If I had to pick a company to have absolute power over something very important, Twitter wouldn’t be very high on the list.
They’re not obsessed with messing with developers’ heads — we’re just innocent bystanders getting hit whenever this fundamentally insecure, jealous, unstable company changes direction, which happens every few years. Twitter is never happy being Twitter, and it seems at times that its leadership doesn’t realize or doesn’t value what makes it so great. (Ever wonder why there’s so much leadership turnover?) And they’re now under the financial pressures of being a high-profile public company. It’s a powder keg.
Maybe they’ll tell us how great we are this week and they won’t burn us again. And I’m sure the people saying that on stage at their conference will honestly believe that. But it’s only a matter of time before those people move on to different jobs, Twitter’s direction changes again, and developers suddenly find themselves in the wrong quadrant of the newest initiative.
Twitter will never, and should never, have any credibility with developers again. Enjoy it while it lasts, but be ready for it to disappear at any moment.
Mandrill: Integrate, deliver, track and analyze — email infrastructure from Mandrill. Use promo code accidentaltech for 50,000 free email sends per month for your first 6 months.
Squarespace: A better web starts with your website. Use coupon ATP for 10% off.
(I’m starting to put these bullets here instead of the old Sponsor One, Two, Three style because if I’m going to include these links at all, it’s more helpful to have some explanation and any relevant promo codes.)
The 5K Retina iMac is out, and it looks incredible so far on paper — so incredible that I’m seriously considering selling my new Mac Pro to get the Retina iMac instead. In fact, the case for the Mac Pro for anyone but advanced video editors, 3D modelers, and heavy OpenCL users is now weaker than ever.
The iMac starts at $2500 and the Mac Pro starts at $3000, but you shouldn’t buy the base model of either.
The best bang-for-the-buck CPU options are the 4 GHz CPU in the iMac and the 6-core in the Mac Pro. I recommend a minimum of 16 GB RAM — go 32 if you can — and a 512 GB or 1 TB SSD. The iMac offers Fusion Drive, but an all-SSD configuration is so much faster and more consistent that you should really get one if you can afford to.
With my recommended midrange configurations for each, the iMac certainly isn’t cheap, but it has a clear price advantage over the Mac Pro, especially since it includes its own display:
Mac Pro with 6-core, 16 GB, 512 GB SSD, D500: $4300
Mac Pros generally hold their value better over time, and whatever monitor you use with your Mac Pro can likely be kept and reused through multiple computers. But that’s a big price difference to overcome.
Intel’s next CPU cores (Broadwell) are significantly delayed, so in the meantime, they released a few more high-end Haswell models. The Retina iMac’s 4 GHz option is the Core i7-4790K, which is currently the fastest CPU in the world for most single-threaded tasks.
Since the Xeons in the Mac Pro are based on the even older Ivy Bridge microarchitecture, they’ve been lagging behind even the previous iMacs for single-threaded apps. According to early Geekbench reports, the 4 GHz, 4-core Retina iMac appears to be 25% faster than the 6-core Mac Pro in single-threaded tasks and only about 15% slower in multi-threaded tasks. That’s incredible.
We don’t know how good the iMac’s GPUs are yet, but based on past choices, the iMac is likely to be better than the Mac Pro for games, but significantly worse for OpenCL and professional 3D applications.
The old Mac Pros had a huge advantage in expandability: they had up to 8 RAM slots, 4 internal hard drive bays, 4 PCI-Express slots, 2 optical bays, and tons of ports on the back. The new ones only have… tons of ports on the back.
The Mac Pro is still more expandable than the iMac in some ways. It has 6 Thunderbolt ports across 3 buses for more monitors and high-bandwidth external storage capacity, and it supports up to 64 GB of RAM instead of the iMac’s 32 GB ceiling. Otherwise, the differences are small.
5K versus 4K displays
This difference is much bigger than it sounds. It’s the same, proportionally, as the difference between typical 21- to 24-inch and 27- to 30-inch monitors: “4K” computer monitors have 8.3 megapixels, while “5K” has 14.7 megapixels. Without software scaling to simulate higher density, the “right” size for a 4K monitor tops out at 24 inches, while a 5K monitor looks right at 27 to 30 inches.
It’s a huge difference.
Waiting for an external Apple 5K display for Mac Pros or other Macs?
If I had to guess, you’ll have a long wait, and they won’t work with any Mac sold to date.
Panel yields may be tight for a while, and external displays are a low priority for Apple. The original 27” iMac’s groundbreaking LCD panel wasn’t available in an external display from Apple for almost a year after its release. But that’s not the biggest problem.
Pushing this many pixels requires more bandwidth than DisplayPort 1.2 offers, which is what Thunderbolt 2 ports use for outputting video signals. (I wrote about this a few times.) Doing it right will require waiting until DisplayPort 1.3 in Thunderbolt 3 on Broadwell’s successor, Skylake, which isn’t supposed to come out for at least another year — and Intel is even worse at estimating ship dates than I am, so it’s likely to be longer.
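A quick back-of-the-envelope check shows why, using DisplayPort's published effective data rates (these count active pixels only; real display timings add blanking overhead, so the true requirement is even higher):

```python
# Why 5K at 60 Hz doesn't fit in a single DisplayPort 1.2 link.
bits_needed = 5120 * 2880 * 60 * 24   # width x height x Hz x bits per pixel
dp12_payload = 17.28e9                # DP 1.2 (HBR2) after 8b/10b coding overhead
dp13_payload = 25.92e9                # DP 1.3 (HBR3) after 8b/10b coding overhead

print(bits_needed / 1e9)              # 21.233664, i.e. about 21.2 Gbit/s
print(bits_needed <= dp12_payload)    # False: one DP 1.2 link can't carry it
print(bits_needed <= dp13_payload)    # True
```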
It may be possible to use two DisplayPort 1.2 or Thunderbolt 2 cables to power a 5K display, but only if the GPU could treat each port as its own full-bandwidth DisplayPort 1.2 channel, the sum of which represented one logical display, and the panel could combine and properly sync the two at the other end.1 I don’t think any of the current Macs can do this, including the Mac Pro — MST to run 4K panels at 60 Hz only seems to be supported within individual ports, not spanned across two.
I’d estimate — granted, I’m wrong a lot — that Apple won’t ship a standalone 5K display until at least 2016, and it won’t work with any of today’s Macs, including the 2013 Mac Pro.
Waiting for the Dell 5K monitor?
We don’t know whether Dell’s upcoming 5K monitor will work with the current Mac Pro yet. Just like the theoretical Apple external 5K monitor, it will rely on tricks like MST to be treated as one big monitor, which may be unsupported or buggy on the Mac Pro.
It’s also a Dell.2 Dell monitors used to be great, but their quality has been inconsistent and declining in recent years, and they’re certainly not known for their visual appeal or classy materials.
Waiting for Broadwell iMacs or Haswell-EP Mac Pros?
Broadwell-K CPUs suitable for iMacs are due out in about a year. Broadwell’s main improvement over Haswell is reduced power consumption, and while this matters a lot in laptops, it won’t be as important in desktops. I’d expect maybe a 10–15% performance improvement in a Broadwell iMac update next year.
If the Mac Pro gets updated to new CPUs anytime soon, they’ll be the new Haswell-EP Xeons. The midrange 6-core model, likely to remain the best bang-for-the-buck option, will likely use the Xeon E5-1650 v3. Here’s how it compares to the 4 GHz iMac CPU — the iMac still holds a big lead in single-threaded tasks, and doesn’t lose the multi-threaded test by too much considering it only has 4 cores.
You can also see how well the iMac’s i7-4790K performs against the new 10-core Xeons in AnandTech’s benchmark — it’s competitive in everything but the most parallel tasks.
So it’s unlikely that the relative performance between the iMac and Mac Pro will change in the near future. The iMac will remain close or faster in single-threaded tasks, and the Mac Pro will beat the iMac at multi-threaded and OpenCL tasks, with the multi-threaded gap being larger if you get the (much more expensive) 8- or 12-core Mac Pro.
Heat and fan noise
The Mac Pro is ridiculously quiet. With any ambient noise at all, you simply can’t hear it. And that’s true no matter what it’s doing — even under full load, I never hear it. The unified heatsink with the single giant, slow fan is remarkably good.
The Retina iMac uses the same internal design as the previous 27-inch iMac with heatpiped heatsinks cooled by one medium-sized fan, and the Retina’s overall thermal load seems similar. Apple claims the Retina iMac is only 15 dB in “wireless web” use — just 0.5 dB louder than the Mac Pro — although neither spec lists noise levels at sustained heavy loads, where I’d expect the iMac to be noticeably louder than the Mac Pro based on these designs.3
If you’re very concerned about minimizing heat and noise, consider your options carefully. The upgraded CPU and GPU, and choosing Fusion Drive instead of all-SSD, will each increase the amount of heat (and therefore fan noise).
Mac Pros use Xeons, server-class chipsets, error-correcting RAM, and workstation-class GPUs, all of which are designed more conservatively and with stricter tolerances than the consumer-grade components in laptops and iMacs.
In practice, I’ve always found the consumer-grade parts in laptops to be slightly buggy. Occasionally, they won’t come out of sleep properly, or they’ll kernel-panic for no apparent reason. But that may only happen a handful of times over the entire lifetime of the machine, so it’s not a huge problem — but an important difference to some.
I’d also worry about the amount of heat in the enclosure, especially so close to the screen. That’s going to be an expensive screen replacement if it’s out of warranty. Since AppleCare is so cheap on iMacs and this is a first-generation product, I’d get it.
So who’s the Mac Pro for?
At this point, not a lot of people:
People who heavily use OpenCL apps
People needing as much parallel CPU power as possible, such as professional video editors, who can afford the 8- or 12-core CPUs
Anyone using a lot of Thunderbolt devices
Anyone who needs a lot of monitors, an HDMI output, or two built-in network interfaces
People who need the quietest computer possible under any load
Roles in which a kernel panic or other slight hardware glitch may be very costly
This list keeps getting shorter over time. I think I finally fell off of it.
Many 4K monitors use this trick, called MST, to split themselves into logical left/right halves and run at their full resolution and framerate by acting as two monitors. In practice, MST is finicky, buggy, and poorly supported. This theoretical two-Thunderbolt-cables idea for 5K would be even more complex, and it may not be reasonably possible at all without weird sync issues like tearing between the two logical halves. ↩
I informally polled owners of recent 27” iMacs on Twitter about fan noise, and the consensus was that they’re near-silent the vast majority of the time, with the fans only becoming slightly audible under sustained parallel loads like video encoding. ↩
The sad truth has been that, while profitable from week one, the publication has had a declining subscription base since February 2013. It started at such a high level that we could handle a decline for a long time, but despite every effort — including our first-year anthology crowdfunded a bit under a year ago — we couldn’t replace departing subscribers with new ones fast enough.
I’m sad to see this go, but I’m unfortunately not surprised — the decline began under my ownership, and I couldn’t turn it around. Glenn wanted to give it a shot, so he took over, and took it far further and for much longer than I thought possible.
Many non-ideal factors and decisions I made up front probably contributed to The Magazine not being sustainable forever. But the biggest challenge was simply that running a magazine today is a really tough business. I thought making a high-quality app was the hard part that was keeping iPad magazines from being more successful, but the app turned out to be the easiest and least important part of the business.
What happens when a comedian who doesn’t give a shit about the advertising industry and can afford not to give a shit about the advertising industry is given an award by the advertising industry during their most self-congratulatory annual event. Amazing.
I don’t know why I didn’t watch this sooner. I have a lot to learn from it.
Inevitable hostility and abuse from the public applies to almost everyone with any degree of fame or micro-fame, but it seems especially brutal in the gaming industry. And that’s not new — the gaming media and the loudest fans have always been horrible to game producers. I don’t know how or why anyone stays in that business.
Dark mode is one of Overcast’s top feature requests — I think more people have asked for dark mode than streaming or video support. I’ve been assuming it wasn’t necessary since you don’t really spend much time looking at your podcast player, and situations in which minimal light emission is necessary (like the next-to-your-sleepy-spouse-in-bed scenario) often don’t facilitate audio playback.
Bono, Edge, Adam Clayton and Larry Mullen Jr believe so strongly that artists should be compensated for their work that they have embarked on a secret project with Apple to try to make that happen, no easy task when free-to-access music is everywhere (no) thanks to piracy and legitimate websites such as YouTube. Bono tells TIME he hopes that a new digital music format in the works will prove so irresistibly exciting to music fans that it will tempt them again into buying music—whole albums as well as individual tracks. The point isn’t just to help U2 but less well known artists and others in the industry who can’t make money, as U2 does, from live performance. “Songwriters aren’t touring people,” says Bono. “Cole Porter wouldn’t have sold T-shirts. Cole Porter wasn’t coming to a stadium near you.”
In Time’s forthcoming cover story, Bono hints that the band’s next record is “about 18 months away” and will be released under the new file format. “I think it’s going to get very exciting for the music business,” Bono tells Time, “[it will be] an audiovisual interactive format for music that can’t be pirated and will bring back album artwork in the most powerful way, where you can play with the lyrics and get behind the songs when you’re sitting on the subway with your iPad or on these big flat screens. You can see photography like you’ve never seen it before.”
If correct, this sure is a lot of misguided thinking and misplaced optimism.
If you’re actively using a screen, music competes with everything else that screen can do — and these days, that’s a lot. You’re lucky if people listen to music at all anymore, and the most you can usually hope for is that they have it on in the background while doing some other activity that doesn’t provide its own audio. The most important music-discovery platform in the world is YouTube.
So I can see why people in the music business might think it’s important to make and sell interactive, multimedia music formats (what decade is this?) to compete, but I don’t think they stand a chance. Every trend in music is going in the opposite direction.
Music sales are declining rapidly as more people switch to streaming services. That ship has sailed. It’s not turning around.1
Full albums are as interesting to most people today as magazines. Single songs and single articles killed their respective larger containers. This is true on both the supply and demand sides: most people don’t listen to full albums, and most bands don’t produce very good ones.2 People only care about hit singles. That ship has sailed, too.3
This alleged new format will cost a fortune to produce: people have to take the photos, design the interactions, build the animations, and make the deals with Apple. Bono’s talking point about helping smaller bands is ridiculous — smaller bands can barely afford professional production on the music, let alone these extras.
Apple doesn’t have the market power anymore to lock in a proprietary format’s success. When everyone was still buying on iTunes and listening on iPods, the chances of success were better, but that’s not the case today. The market is too diverse, especially with so much listening happening on streaming services and non-Apple devices that can’t and won’t display any of these extras.
So maybe this would have worked in the past. Maybe, say, in 2009, when Apple’s market power was more dominant and streaming services weren’t taking over music yet.
Fortunately, we don’t need to wonder how a theoretical new multimedia album format in 2009 would have fared, because Apple really launched one. Remember iTunes LP? It’s still around, but it never really took off, it hasn’t saved full-album sales, it hasn’t reduced piracy or the appeal of streaming services, and the music industry is still losing relevance.
Because just like every other hopeful music and movie format, people don’t value the “extras” very much. People value the music itself (just barely) and the convenience of playing it the way they want. That’s it.
So many people re-bought music they already owned on vinyl or cassettes through the shifts to CDs and digital downloads mostly because each medium was so much more convenient than its predecessor. Nobody bought CDs because the booklets were longer and had more liner notes than the fold-in cassette cards.
SACD and DVD-Audio never went anywhere, and Pono likely won’t,4 because imperceptibly better sound quality isn’t compelling enough to overcome the dramatic loss of convenience that new, proprietary formats bring to today’s world of ubiquitous music players.
There’s nothing Apple or Bono can do to make people care enough about glorified liner notes. People care about music and convenience, period.
As for “music that can’t be pirated”, I ask again, what decade is this? That ship has not only sailed long ago; it has circled the world hundreds of times, sunk, been dragged up, been turned into a tourist attraction, gone out of business, and been gutted and retrofitted as a more profitable oil tanker. Piracy is not the music industry’s real problem and never has been, and we have yet to come up with any audio or video medium that truly can’t be pirated.
In 2007, Steve Jobs wrote an essay called “Thoughts on Music” to attempt to pressure the big record labels into agreeing to DRM-free music sales. Here’s a portion of it:
Imagine a world where every online store sells DRM-free music encoded in open licensable formats. In such a world, any player can play music purchased from any store, and any store can sell music which is playable on all players. This is clearly the best alternative for consumers, and Apple would embrace it in a heartbeat. If the big four music companies would license Apple their music without the requirement that it be protected with a DRM, we would switch to selling only DRM-free music on our iTunes store. Every iPod ever made will play this DRM-free music.
Why would the big four music companies agree to let Apple and others distribute their music without using DRM systems to protect it? The simplest answer is because DRMs haven’t worked, and may never work, to halt music piracy.
Jobs likely had ulterior motives, as usual — he likely wanted easier negotiations and flexibility for future hardware and features, and probably knew the upcoming Amazon MP3 Store had negotiated DRM-free music and didn’t want to be at a competitive disadvantage.
But I bet he also truly disliked DRM, as a tasteful consumer, technologist, and human being, and wanted to abolish as much of it as he could.
The effort ended up succeeding, mostly. TV shows and movies from iTunes didn’t stand a chance of going DRM-free, but iTunes music did indeed lose its DRM in the following months (this page is still up). And the world didn’t end. Piracy didn’t suddenly explode. Everything stayed mostly the same, except it was nicer to be a music customer.
Now that we’re all accustomed to DRM-free music, I think it would be a big mistake to ever launch a DRM-encumbered music format for purchasing again.6 It’s hard enough to get people to buy music today at all — the last thing the industry needs is another excuse for people not to care.
I say this as a frequent buyer of music and a very rare user of any streaming services. ↩
I say this as a fan, and exclusive listener, of full albums. The decline is obvious. ↩
This has been the case for decades, but most of the time in the physical-media days, the only way to get the hit singles was to buy full albums. Cassette and CD “singles” had few releases and were relatively uncommon because everyone in the chain — bands, record labels, and retailers — made more money on full-priced albums. ↩
Higher-than-CD-quality music isn’t new. If there was truly much demand for it, Apple would have already been pressured to sell it and support its playback, but effectively nobody cares. How compelling of an alternative to streaming services would a high-bitrate format be? “We’re going to make the music collection on your 16 GB iPhone 4–10 times larger for a benefit that’s impossible to hear on any headphones, let alone your EarPods or Beats.” If forced to choose sound quality or convenience, convenience wins every time. ↩
According to the Wayback Machine, the original page was taken down sometime between December 27, 2010 and January 27, 2011. I suspect, and hope, that this was a result of Apple’s poor website management rather than a deliberate action, although 2010’s similar Thoughts on Flash is still up. ↩
DRM will remain justifiable for streaming services. ↩
I haven’t reviewed any Kindles recently because they simply haven’t been noteworthy. Every new Kindle model would take two steps forward, one step back, and one weird stumble diagonally somewhere. The basic screen technology made tiny improvements over time, but the devices became cheaper, flimsier, and more disposable.
The Voyage is positioned as a new premium model. Among other improvements, it has a glass screen, magnesium case, and — most appealingly — what’s probably a Retina-class e-ink screen at 300 DPI. Now that’s something I want to see.
And they finally added page-turn buttons back… sort of. The ideal e-ink Kindle would have hardware page-turn buttons and a touch screen, and the Voyage is the first one to promise that, but instead of buttons, they’ve added “pressure-based page turn sensors with haptic feedback.”
You know what else is a pressure-based sensor with haptic feedback? A button.
I bet those are the diagonal stumble of this generation. Regardless, I’m excited enough about the high-DPI screen and page-turn “sensors” to take the risk and preorder it. If you’re ordering one and would like to support this site, I’d love it if you used my affiliate link, as Kindles make great referral money. Thanks.
I shipped Overcast 1.0.4 using this great trick to replace the various-sized launch images (Default.png) with a storyboard. This lets Overcast show an accurate launch image on any-sized device, at any screen resolution, and in any orientation without me having to make and ship separate static images for each one.
Despite Overcast being marked iPhone-only in the Info.plist, it runs at full size on iPads instead of in the little “classic” windowed mode that usually wraps iPhone-only apps when running on iPads.
I don’t actually know for sure that this is related to the storyboard launch image, but it’s extremely likely that it’s an iOS 8 bug when handling this edge case (a storyboard launch image on an iPhone-only app running on an iPad).
Special thanks to David Dudovitz for bringing this to my attention. (My iPad is still updating to 8 and will allegedly take 17 more hours.) Here are some screenshots from David for your curiosity and amusement. It appears to work fully, but it looks ridiculous.
Anyway, this is definitely not how I wanted to launch an iPad version. I’m going to let this version continue to exist in this state for the time being because it’s not doing much harm (and might be better than the alternative, I suppose), but I’m going to attempt to make it less ridiculous in the next update.