Arbitrary decisions like this harm Apple’s relationship with developers. If Apple wants developers to keep creating innovative Today View widgets, then it needs to publish detailed, specific guidelines of what widgets can and cannot do. When blood, sweat, tears, and livelihoods are on the line, “I’ll know it when I see it,” doesn’t cut it as an App Store approval policy.
Exactly. It’s in both developers’ and Apple’s best interests that Apple not impose many capricious, unnecessary restrictions, especially as Apple increasingly needs developers to push boundaries to revitalize the iPad.
I don’t use the Heil PR40, but my Shure SM7B has many of the same characteristics and benefits. The SM7B also has an internal shockmount and a big foam pop-filtering cover, so I can omit separate shockmounts and filters, which I especially love.1
I can’t universally recommend the SM7B without one big caveat, though: it needs a lot of clean gain (as does the PR40), which many inexpensive XLR-to-USB interfaces can’t provide without a lot of noise (underlying hiss). Most known-good cheap options have been discontinued — I got an Mbox Mini 3 on eBay for about $120, but it died in less than a year. The Blackjack works, but is noticeably noisy. I haven’t found an SM7B-suitable USB interface cheaper than the $600 Apogee Duet that’s still made, has very low noise at high gain, and is likely to last a long time. (I didn’t even like the Duet’s software reliance, so I went all the way to the all-hardware but very expensive USBPre 2.)
The SM7B is legendary. Many claim Thriller was recorded on one — I don’t know if that’s verifiable, but I was happier to see that Robin Quivers uses one. ↩
Verizon is altering the web traffic of its customers by inserting a Unique Identifier Header or UIDH, a temporary serial number that lets advertisers identify Verizon users on the web.
According to Jacob Hoffman-Andrews of the Electronic Frontier Foundation, the UIDH serves as a “perma-cookie” that can be read by any web server to “build a profile” of internet habits. …
Verizon has been using Relevant Advertising techniques for two years, but the tracking went largely unnoticed until recently, when the extra data in Verizon customers’ traffic was spotted. AT&T appears to be engaging in similar tracking activities, and is testing its own Relevant Advertising system.
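The mechanics are simple to sketch: the carrier’s network inserts an extra HTTP header (reported as X-UIDH) into unencrypted requests, so any web server can read the identifier back out — which is why clearing cookies does nothing. A minimal illustration in Python; the header name is the one researchers reported, and everything else here (the helper, the sample value) is hypothetical:

```python
def extract_uidh(headers):
    """Return the carrier-injected tracking identifier, if present.

    The X-UIDH header is added by Verizon's network, not by the user's
    browser, so it survives cookie clearing and private browsing.
    """
    # HTTP header names are case-insensitive, so normalize the keys first.
    normalized = {k.lower(): v for k, v in headers.items()}
    return normalized.get("x-uidh")

# Hypothetical request headers as a web server might see them:
request_headers = {
    "Host": "example.com",
    "User-Agent": "Mozilla/5.0",
    "X-UIDH": "OTgxNTk2NDk0ADJVquRu5NS5",  # made-up example value
}
print(extract_uidh(request_headers))  # the "perma-cookie"
```

Because the identifier rides on the request itself, every site the user visits over plain HTTP sees the same value — exactly what lets ad networks “build a profile.”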
Jason Snell reporting on PCalc’s after-the-fact rejection for the audacity of making a very nice Notification Center widget, in the fully supported way, that includes a little calculator:
This is an app that was accepted into the App Store, and is even being featured in the App Store as I write this. And now, a few weeks in, someone at Apple has decided that the app is too… what? Too useful?
Like the after-the-fact rejection of Launcher last month, this feels like the worst era of app review returning with a vengeance.
When decisions like this start happening, Apple needs to reevaluate the purpose of app review: to protect itself, its platform, and its customers from spam, fraud, abuse, and malware (and to ensure Apple gets its cut, which is reasonable).
By limiting the usefulness of Notification Center widgets, what is Apple protecting itself or its customers from?
Excellent, fair review and perspective. The storage section at the end is especially incisive:
Apple should not be selling 16 GB iPads. The starting tier for typical consumers should be 32 GB. There’s just not enough usable space on a 16 GB iOS device to do the things Apple has worked so hard to make easy to do. …
I also understand the product marketing angle. That there are a lot of people who will look at the 16 GB models, see that they can get four times the storage for just $100 more, and buy the 64 GB model instead — when they would’ve bought the base model if it were 32 GB. I get it. There’s no doubt in my mind it’s good short-term business sense to go with a 16/64/128 lineup instead of 32/64/128. But Apple is not a short-term business. They’re a long-term business, built on a relationship of trust with repeat customers. 16 GB iPads work against the foundation of Apple’s brand, which is that they only make good products.
Apple has long used three-tier pricing structures within individual product categories. They often used to label them “Good”, “Better”, and “Best”. Now, with these 16 GB entry-level devices, it’s more like “Are you sure?”, “Better”, and “Best”.
This relates to my argument in last week’s Accidental Tech Podcast that Apple’s high hardware margins are a strategy tax that hinders other important factors within the company.
A balance must be struck between healthy profit margins and making good products. I don’t think Apple has ever only made good products,1 but “Are you sure?” models have only come to iPhones and iPads in recent years — the first few generations of iOS devices (and iPods) were all high-end when they came out.
And until this year, the “Are you sure?” iOS devices were just older models that got pushed down the product line over time with embarrassingly low storage sizes, like 8 GB.2 But iOS, cameras, and apps have progressed to the point that even 16 GB devices are now constrained, exacerbated by iOS’ poor user storage management. This is the first year that Apple’s flagship iOS devices are available in sizes that will often result in poor user experiences.
Not long ago, we could just tell our friends and family to buy “an iPhone” or “an iPad”, unqualified. They were all great.
Now, if our parents call us and say, “We just got an iPad!”, we need to think, Oh no, which one did they get? Please don’t be one of the shitty ones…
The cheapest Macs have usually shipped with too little RAM. No computer can be called good today if it ships with less than 8 GB of RAM, like the base models of the Mac Mini, MacBook Air, and non-Retina MacBook Pro — or if it only ships with a hard drive instead of an SSD or Fusion Drive, like the base models of the Mac Mini, 13” MacBook Pro, and non-Retina iMacs. ↩
The free-with-contract iPhones have all been 8 GB, including the 5c this year. ↩
Twitter Inc. has a proposition for app makers: Let’s start over.
Two years ago, Twitter irked developers with stricter rules around applications that plug into the social-media service. This week Twitter hopes to regain their trust and attract a broader set of app makers when it hosts its first developer conference in four years.
As a peace offering, Twitter on Wednesday is expected to announce a suite of tools that aim to make it easier for programmers to build apps, according to people familiar with the matter.
I’m not sure whether this “let’s start over” talking point is coming from Twitter PR or the WSJ, but it’s misguided.
Twitter’s API requires OAuth not only for its alleged security improvements, which are weak, but also to control and limit app developers. If any app could make API calls with HTTPS Basic Auth like the original Twitter API, Twitter would have no reliable way to identify which requests came from which app, so they wouldn’t be able to enforce their restrictions and branding requirements. Any API that requires apps to register with the service and identify themselves with each request is politically fragile, because the service will always have a much bigger stick to wield whenever it’s convenient.
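To make the difference concrete, here’s a rough sketch — not Twitter’s actual implementation, and with the OAuth signature machinery (nonce, timestamp, HMAC) omitted — of what each style of Authorization header carries. A Basic Auth header encodes only the user’s credentials, so every app’s requests look identical to the server; an OAuth 1.0a header names the app via oauth_consumer_key on every single request:

```python
import base64

def basic_auth_header(username, password):
    # Only the user's credentials are encoded -- nothing identifies the app.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

def oauth_header(consumer_key, oauth_token, signature):
    # Every request names the app via oauth_consumer_key, so the service
    # can throttle, restrict, or revoke one specific client at will.
    params = {
        "oauth_consumer_key": consumer_key,  # identifies the *app*
        "oauth_token": oauth_token,          # identifies the *user*
        "oauth_signature": signature,        # proves the app holds its secret
    }
    fields = ", ".join(f'{k}="{v}"' for k, v in params.items())
    return f"OAuth {fields}"

# Hypothetical credentials for illustration:
print(basic_auth_header("alice", "hunter2"))
print(oauth_header("my_client_key", "users_access_token", "computed_sig"))
```

That per-request app identity is the bigger stick: deny the consumer key, and the app is dead, no matter how many users it has.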
But that’s not the biggest problem — even an anonymous API is shaky ground because it can always change or disappear, like Twitter’s original API did. The problem is still the complete power over an increasingly important communication medium residing in a single company and its single centralized service.
Companies grow and change. Business needs change. Founders and leaders move on and get replaced.
Especially at Twitter. Twitter started out as a developer-friendly company, then they became a developer-hostile company, and now they’re trying to be a developer-friendly company again. If I had to pick a company to have absolute power over something very important, Twitter wouldn’t be very high on the list.
They’re not obsessed with messing with developers’ heads — we’re just innocent bystanders getting hit whenever this fundamentally insecure, jealous, unstable company changes direction, which happens every few years. Twitter is never happy being Twitter, and it seems at times that its leadership doesn’t realize or doesn’t value what makes it so great. (Ever wonder why there’s so much leadership turnover?) And they’re now under the financial pressures of being a high-profile public company. It’s a powder keg.
Maybe they’ll tell us how great we are this week and they won’t burn us again. And I’m sure the people saying that on stage at their conference will honestly believe that. But it’s only a matter of time before those people move on to different jobs, Twitter’s direction changes again, and developers suddenly find themselves in the wrong quadrant of the newest initiative.
Twitter will never, and should never, have any credibility with developers again. Enjoy it while it lasts, but be ready for it to disappear at any moment.
Mandrill: Integrate, deliver, track and analyze — email infrastructure from Mandrill. Use promo code accidentaltech for 50,000 free email sends per month for your first 6 months.
Squarespace: A better web starts with your website. Use coupon ATP for 10% off.
(I’m starting to put these bullets here instead of the old Sponsor One, Two, Three style because if I’m going to include these links at all, it’s more helpful to have some explanation and any relevant promo codes.)
The 5K Retina iMac is out, and it looks incredible so far on paper — so incredible that I’m seriously considering selling my new Mac Pro to get the Retina iMac instead. In fact, the case for the Mac Pro for anyone but advanced video editors, 3D modelers, and heavy OpenCL users is now weaker than ever.
The iMac starts at $2500 and the Mac Pro starts at $3000, but you shouldn’t buy the base model of either.
The best bang-for-the-buck CPU options are the 4 GHz CPU in the iMac and the 6-core in the Mac Pro. I recommend a minimum of 16 GB RAM — go 32 if you can — and a 512 GB or 1 TB SSD. The iMac offers Fusion Drive, but an all-SSD configuration is so much faster and more consistent that you should really get one if you can afford to.
With my recommended midrange configurations for each, the iMac certainly isn’t cheap, but it has a clear price advantage over the Mac Pro, especially since it includes its own display:
Mac Pro with 6-core, 16 GB, 512 GB SSD, D500: $4300
Mac Pros generally hold their value better over time, and whatever monitor you use with your Mac Pro can likely be kept and reused through multiple computers. But that’s a big price difference to overcome.
Intel’s next CPU cores (Broadwell) are significantly delayed, so in the meantime, they released a few more high-end Haswell models. The Retina iMac’s 4 GHz option is the Core i7-4790K, which is currently the fastest CPU in the world for most single-threaded tasks.
Since the Xeons in the Mac Pro are based on the even older Ivy Bridge microarchitecture, they’ve been lagging behind even the previous iMacs for single-threaded apps. According to early Geekbench reports, the 4 GHz, 4-core Retina iMac appears to be 25% faster than the 6-core Mac Pro in single-threaded tasks and only about 15% slower in multi-threaded tasks. That’s incredible.
We don’t know how good the iMac’s GPUs are yet. Based on Apple’s past choices, the iMac is likely to be better than the Mac Pro for games, but significantly worse for OpenCL and professional 3D applications.
The old Mac Pros had a huge advantage in expandability: they had up to 8 RAM slots, 4 internal hard drive bays, 4 PCI-Express slots, 2 optical bays, and tons of ports on the back. The new ones only have… tons of ports on the back.
The Mac Pro is still more expandable than the iMac in some ways. It has 6 Thunderbolt ports across 3 buses for more monitors and more high-bandwidth external storage, and it supports up to 64 GB of RAM versus the iMac’s 32 GB ceiling. Otherwise, the differences are small.
5K versus 4K displays
This difference is much bigger than it sounds. It’s the same, proportionally, as the difference between typical 21- to 24-inch and 27- to 30-inch monitors: “4K” computer monitors have 8.3 megapixels, while “5K” monitors have 14.7 megapixels. Without software scaling to simulate higher density, the “right” size for a 4K monitor tops out at 24 inches, while a 5K monitor looks right at 27 to 30 inches.
It’s a huge difference.
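The arithmetic behind those figures is worth seeing: “4K” here means 3840×2160 and “5K” means 5120×2880, and pixel density at a given diagonal follows directly. A quick sketch:

```python
from math import hypot

def megapixels(w, h):
    return w * h / 1_000_000

def ppi(w, h, diagonal_inches):
    # Pixels per inch: diagonal pixel count divided by diagonal size.
    return hypot(w, h) / diagonal_inches

print(f"4K: {megapixels(3840, 2160):.1f} MP")  # 8.3 MP
print(f"5K: {megapixels(5120, 2880):.1f} MP")  # 14.7 MP

# At 2x ("Retina") scaling, 5120x2880 at 27 inches gives the same logical
# workspace as a traditional 2560x1440 27-inch display:
print(f"5K at 27 inches: {ppi(5120, 2880, 27):.0f} PPI")  # ~218 PPI
print(f"4K at 24 inches: {ppi(3840, 2160, 24):.0f} PPI")  # ~184 PPI
```

Push 4K much past 24 inches and the density drops below the ~180 PPI needed for clean 2x scaling, which is why the “right” size tops out there.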
Waiting for an external Apple 5K display for Mac Pros or other Macs?
If I had to guess, you’ll have a long wait, and they won’t work with any Mac sold to date.
Panel yields may be tight for a while, and external displays are a low priority for Apple. The original 27” iMac’s groundbreaking LCD panel wasn’t available in an external display from Apple for almost a year after its release. But that’s not the biggest problem.
Pushing this many pixels requires more bandwidth than DisplayPort 1.2 offers, which is what Thunderbolt 2 ports use for outputting video signals. (I wrote about this a few times.) Doing it right will require waiting until DisplayPort 1.3 in Thunderbolt 3 on Broadwell’s successor, Skylake, which isn’t supposed to come out for at least another year — and Intel is even worse at estimating ship dates than I am, so it’s likely to be longer.
It may be possible to use two DisplayPort 1.2 or Thunderbolt 2 cables to power a 5K display, but only if the GPU can treat each port as its own full-bandwidth DisplayPort 1.2 channel, with the sum representing one logical display, and the panel can combine and properly sync the two halves at the other end.1 I don’t think any current Mac can do this, including the Mac Pro — MST to run 4K panels at 60 Hz only seems to be supported within individual ports, not spanned across two.
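The bandwidth math backs this up. These are back-of-the-envelope numbers — raw pixel data only, ignoring blanking intervals, which push the real requirement even higher — but even the optimistic case exceeds what one DisplayPort 1.2 link can carry:

```python
def gbps(width, height, refresh_hz, bits_per_pixel=24):
    # Raw pixel data rate in Gbps, ignoring blanking overhead.
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DisplayPort 1.2 (the video layer of Thunderbolt 2): 4 lanes at 5.4 Gbps
# each, with 8b/10b encoding leaving 80% of that usable as payload.
DP12_PAYLOAD_GBPS = 4 * 5.4 * 0.8  # 17.28 Gbps

need_5k = gbps(5120, 2880, 60)  # ~21.2 Gbps -- doesn't fit in one link
need_4k = gbps(3840, 2160, 60)  # ~11.9 Gbps -- fits
print(f"5K@60Hz needs {need_5k:.1f} Gbps vs {DP12_PAYLOAD_GBPS:.2f} available")
print(f"4K@60Hz needs {need_4k:.1f} Gbps")
# Splitting the panel across two DP 1.2 links (the MST-style trick) halves
# the per-link requirement to ~10.6 Gbps, which is why it's even conceivable.
```

So a single Thunderbolt 2 port falls roughly 4 Gbps short of uncompressed 5K at 60 Hz, before blanking overhead; only some two-link arrangement could bridge the gap on today’s hardware.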
I’d estimate — granted, I’m wrong a lot — that Apple won’t ship a standalone 5K display until at least 2016, and it won’t work with any of today’s Macs, including the 2013 Mac Pro.
What about Dell’s 5K monitor?
We don’t know whether Dell’s monitor will work with the current Mac Pro yet. Just like a theoretical Apple external 5K display, it will rely on tricks like MST to be treated as one big monitor, which may be unsupported or buggy on the Mac Pro.
It’s also a Dell.2 Dell monitors used to be great, but their quality has been inconsistent and declining in recent years, and they’re certainly not known for their visual appeal or classy materials.
Waiting for Broadwell iMacs or Haswell-EP Mac Pros?
Broadwell-K CPUs suitable for iMacs are due out in about a year. Broadwell’s main improvement over Haswell is reduced power consumption, and while this matters a lot in laptops, it won’t be as important in desktops. I’d expect maybe a 10–15% performance improvement in a Broadwell iMac update next year.
If the Mac Pro gets updated to new CPUs anytime soon, they’ll be the new Haswell-EP Xeons. The midrange 6-core model, likely to remain the best bang-for-the-buck option, will likely use the Xeon E5-1650 v3. Here’s how it compares to the 4 GHz iMac CPU — the iMac still holds a big lead in single-threaded tasks, and doesn’t lose the multi-threaded test by too much considering it only has 4 cores.
You can also see how well the iMac’s i7-4790K performs against the new 10-core Xeons in AnandTech’s benchmark — it’s competitive in everything but the most parallel tasks.
So it’s unlikely that the relative performance between the iMac and Mac Pro will change in the near future. The iMac will remain as fast as or faster than the Mac Pro in single-threaded tasks, and the Mac Pro will beat the iMac at multi-threaded and OpenCL tasks, with the multi-threaded gap growing if you get the (much more expensive) 8- or 12-core Mac Pro.
Heat and fan noise
The Mac Pro is ridiculously quiet. With any ambient noise at all, you simply can’t hear it. And that’s true no matter what it’s doing — even under full load, I never hear it. The unified heatsink with the single giant, slow fan is remarkably good.
The Retina iMac uses the same internal design as the previous 27-inch iMac, with heatpiped heatsinks cooled by one medium-sized fan, and the Retina’s overall thermal load seems similar. Apple claims the Retina iMac is only 15 dB in “wireless web” use — just 0.5 dB louder than the Mac Pro — although Apple doesn’t specify noise levels for either machine under sustained heavy loads, where I’d expect the iMac to be noticeably louder based on these designs.3
If you’re very concerned about minimizing heat and noise, consider your options carefully. The upgraded CPU and GPU, and choosing Fusion Drive instead of all-SSD, will each increase the amount of heat (and therefore fan noise).
Reliability
Mac Pros use Xeons, server-class chipsets, error-correcting RAM, and workstation-class GPUs, all of which are designed more conservatively and with stricter tolerances than the consumer-grade components in laptops and iMacs.
In practice, I’ve always found the consumer-grade parts in laptops to be slightly buggy. Occasionally, they won’t come out of sleep properly, or they’ll kernel-panic for no apparent reason. But that may only happen a handful of times over the entire lifetime of the machine, so it’s not a huge problem — but an important difference to some.
I’d also worry about the amount of heat in the enclosure, especially so close to the screen. That’s going to be an expensive screen replacement if it’s out of warranty. Since AppleCare is so cheap on iMacs and this is a first-generation product, I’d get it.
So who’s the Mac Pro for?
At this point, not a lot of people:
People who heavily use OpenCL apps
People needing as much parallel CPU power as possible, such as professional video editors, who can afford the 8- or 12-core CPUs
Anyone using a lot of Thunderbolt devices
Anyone who needs a lot of monitors, an HDMI output, or two built-in network interfaces
People who need the quietest computer possible under any load
Roles in which a kernel panic or other slight hardware glitch may be very costly
This list keeps getting shorter over time. I think I finally fell off of it.
Many 4K monitors use this trick, called MST, to split themselves into logical left/right halves and run at their full resolution and framerate by acting as two monitors. In practice, MST is finicky, buggy, and poorly supported. This theoretical two-Thunderbolt-cables idea for 5K would be even more complex, and it may not be reasonably possible at all without weird sync issues like tearing between the two logical halves. ↩
I informally polled owners of recent 27” iMacs on Twitter about fan noise, and the consensus was that they’re near-silent the vast majority of the time, with the fans only becoming slightly audible under sustained parallel loads like video encoding. ↩
The sad truth has been that, while profitable from week one, the publication has had a declining subscription base since February 2013. It started at such a high level that we could handle a decline for a long time, but despite every effort — including our first-year anthology crowdfunded a bit under a year ago — we couldn’t replace departing subscribers with new ones fast enough.
I’m sad to see this go, but I’m unfortunately not surprised — the decline began under my ownership, and I couldn’t turn it around. Glenn wanted to give it a shot, so he took over, and took it far further and for much longer than I thought possible.
Many non-ideal factors and decisions I made up front probably contributed to The Magazine not being sustainable forever. But the biggest challenge was simply that running a magazine today is a really tough business. I thought making a high-quality app was the hard part that was keeping iPad magazines from being more successful, but the app turned out to be the easiest and least important part of the business.
What happens when a comedian who doesn’t give a shit about the advertising industry and can afford not to give a shit about the advertising industry is given an award by the advertising industry during their most self-congratulatory annual event. Amazing.
I don’t know why I didn’t watch this sooner. I have a lot to learn from it.
Inevitable hostility and abuse from the public applies to almost everyone with any degree of fame or micro-fame, but it seems especially brutal in the gaming industry. And that’s not new — the gaming media and the loudest fans have always been horrible to game producers. I don’t know how or why anyone stays in that business.
Dark mode is one of Overcast’s top feature requests — I think more people have asked for dark mode than streaming or video support. I’ve been assuming it wasn’t necessary since you don’t really spend much time looking at your podcast player, and situations in which minimal light emission is necessary (like the next-to-your-sleepy-spouse-in-bed scenario) often don’t facilitate audio playback.