Activity › Forums › Creative Community Conversations › A Mac Pro prediction
-
Walter Soyka
October 29, 2012 at 9:07 pm
[Chris Kenny] "Look, you can play the multiplication game all day long. What if I need five streams of 4K at 60 frames a second? Hey, that's 22 GB/s, good luck finding any sort of RAID controller for that."
It’s not a multiplication game — I pointed out a real-world scenario that I see every day.
If you need 22 GB/s, you’ll be I/O-bound on any system.
Are you really arguing that since you can’t attain unreasonably fast speeds that no system offers, you shouldn’t bother attaining the reasonably fast ones that other competing systems can?
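For what it's worth, the arithmetic behind that "22 GB/s" figure is easy to check. This sketch assumes a DCI 4K raster and 16 bpc RGBA (the exact frame geometry isn't stated in the thread):

```python
# Sanity-check the bandwidth figure quoted above: five streams of
# 4K (4096x2160) at 60 fps in 16 bpc RGBA (4 channels x 2 bytes each).
WIDTH, HEIGHT = 4096, 2160          # DCI 4K raster (assumed)
BYTES_PER_PIXEL = 4 * 2             # RGBA at 16 bits per channel
FPS = 60
STREAMS = 5

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL
bytes_per_second = bytes_per_frame * FPS * STREAMS

print(f"per frame: {bytes_per_frame / 1e6:.1f} MB")
print(f"aggregate: {bytes_per_second / 1e9:.1f} GB/s")
# aggregate works out to about 21.2 GB/s, i.e. roughly the 22 GB/s quoted
```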
A workstation needs to be a balanced system — the fastest processor cores in the world will be of no use to you if you can't read data off the disk subsystem fast enough to feed them. If you're building in a bottleneck, you're building a poor workstation.
[Chris Kenny] “Most projects don’t need that. They’re not even processed or finished in 16bpc RGBA, and if they are, it’s not the end of the world if you have to render the handful of shots that actually require compositing (i.e. multiple streams) — certainly most users won’t spend thousands of extra dollars on storage to avoid it.”
Maybe most of your projects don’t need that — but nearly all of mine do. Sending multi-channel 16-bit 3D renders to compositing is an everyday workflow in my corner of the industry.
Different users have different needs. I certainly agree with you that fewer users need workstations than before, because regular desktops have become powerful enough and offer fast enough throughput.
Workstations are a demanding niche, no doubt, and maybe Apple isn’t interested in trying to meet those needs. We’ll have to see.
[Chris Kenny] “Because doing so will allow Apple to make it smaller, quieter, less expensive. In other words, a better system for the vast majority of pro customers who don’t (with the existence of Thunderbolt) need quite that much internal expansion. I know I’d find a significantly smaller Mac Pro very convenient for use on location, for instance.”
I disagree. I've discussed this several times with Craig, but I don't see how slots add much space, noise, or cost to the system. These things come from those big hot CPUs, big hot RAM, big hot GPU, and the big hot power supply to keep them and the cooling they all require running.
Further, you are not actually saving any space, noise, or cost by skipping PCIe slots when you are then adding in outboard Thunderbolt video I/O and storage. You’re just moving it around, and sacrificing speed for flexibility.
That said, I do want you to have a smaller option for on-location use. I could use a more powerful, more mobile configuration myself. I just fear it will come from Apple at the expense of a larger, more powerful, more expandable option for in-studio use.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events -
Chris Kenny
October 29, 2012 at 9:36 pm
[Walter Soyka] "Are you really arguing that since you can't attain unreasonably fast speeds that no system offers, you shouldn't bother attaining the reasonably fast ones that other competing systems can?"
What I’m arguing is that for any set of specifications, in this business you can come up with a use case that requires even more. Apple has never targeted the very highest reaches of this market. To understand what Apple will likely do, one should look at common industry practice, not evaluate whether the proposed system will meet any obscure high-end requirement someone wants to name.
[Walter Soyka] “Maybe most of your projects don’t need that — but nearly all of mine do. Sending multi-channel 16-bit 3D renders to compositing is an everyday workflow in my corner of the industry.”
And you’re doing this with external media how? By moving 8+ spindle SAS RAID enclosures around? And you do so much of it that outputting a little more slowly would be a major problem? Again, I’m not saying nobody does this stuff, but — and this isn’t meant as an attack on you, because I have no idea what your particular situation is — there’s a tendency in this industry to play this sort of game where, instead of looking at what’s actually, realistically required to perform certain tasks, and making a reasonable cost/benefit analysis, people say things that amount to “Oh, well, I’d use X equipment to do Y task, and if you’re advocating doing Y with something less than X, you’re not as serious as I am”.
The truth is, Moore’s Law has brought us to the point where there are seriously diminishing returns on high-end specialty computing gear, and the higher you get the more they diminish.
[Walter Soyka] “Further, you are not actually saving any space, noise, or cost by skipping PCIe slots when you are then adding in outboard Thunderbolt video I/O and storage. You’re just moving it around, and sacrificing speed for flexibility.”
That’s potentially true of a fully-loaded system, but slots and drive bays take up space whether they’re full or not, and take up the same space regardless of the size of what’s in them. An external UltraStudio Mini Monitor adds a lot less volume to a machine than a slot that can potentially accommodate a full-length PCIe card does.
Everything keeps getting faster. At some point, it gets so fast that it actually makes quite a lot of sense to start sacrificing some speed for additional flexibility. In the consumer market, it's exactly this phenomenon that has led so many people to laptops and now to tablets. In our market, "What slot am I going to install a RedRocket card in?" will someday seem as pointless as "What slot am I going to install this SoundBlaster card in?" does in the modern consumer computing world.
—
Digital Workflow/Colorist, Nice Dissolve. You should follow me on Twitter here. Or read our blog.
-
Walter Soyka
October 29, 2012 at 11:09 pm
Chris, I don't disagree with you on much here. Computers are getting faster and cheaper, and general-purpose computers are more capable for a great many tasks that used to need specialty systems. I understand why Apple may not want to pursue those markets.
I am responding to the point you raised about what “a ‘pro’ Mac should do.”
Major engineering changes to the Mac Pro may result in it becoming a less suitable choice in some production niches. There would be people who were well-served by the 2006, 2008, and 2010 Mac Pros at their respective launches who would be ill-served by a 2013 with limited expansion.
[Chris Kenny] “To understand what Apple will likely do, one should look at common industry practice, not evaluate whether the proposed system will meet any obscure high-end requirement someone wants to name.”
What does that mean? Common industry practices are different in the desktop market and the workstation market.
And I don’t think I’m naming obscure high-end requirements. (I’ll get to those in a minute). Here I was naming real-world requirements for 3D graphics and compositing work for HD material.
What requirements do you think would be fair to expect the successor to the Mac Pro to meet?
[Chris Kenny] “And you’re doing this with external media how? By moving 8+ spindle SAS RAID enclosures around? And you do so much of it that outputting a little more slowly would be a major problem?”
My writing was unclear. It’s actually working with the media and rendering from the media where performance counts. When you have slow disks, not only do you render slower, but you scrub slower and you preview slower. It’s the death of a thousand paper cuts.
I work in large-format animation, so I have some very atypical needs. I did my biggest project yet in terms of data this summer, where a minute-long animation generated 972 GB (not a typo) of multi-pass image sequences. That's a bit over 16 GB per second of finished animation. Retrieving a single frame meant reading over 500 MB off disk. Every little bit of performance counted, but I was still disk-bound the whole time.
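The numbers behind that project are worth laying out. This sketch assumes a 30 fps frame rate, which isn't stated above:

```python
# Rough figures for the project described above: a one-minute
# animation producing 972 GB of multi-pass image sequences.
TOTAL_GB = 972
DURATION_S = 60
FPS = 30                             # assumed frame rate

gb_per_second_of_animation = TOTAL_GB / DURATION_S
frames = DURATION_S * FPS
mb_per_frame = TOTAL_GB * 1000 / frames

print(f"{gb_per_second_of_animation:.1f} GB per second of finished animation")
print(f"{mb_per_frame:.0f} MB read per frame across all passes")
# roughly 16.2 GB per second of animation and 540 MB per frame,
# consistent with the "over 16 GB" and "over 500 MB" figures above
```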
I do understand that I’m a niche within a niche, but I used to be all Mac, and now I don’t feel that I could be anymore even if I wanted to be.
[Chris Kenny] “Everything keeps getting faster. At some point, it gets so fast that it actually makes quite a lot of sense to start sacrificing some speed for additional flexibility. In the consumer market, it’s exactly this phenomenon that has lead so many people to laptops and now to tablets. In our market, “What slot am I going to install a RedRocket card in?” will someday seem as pointless as “What slot am I going to install this SoundBlaster card in?” does in the modern consumer computing world.”
Soyka’s Law: Expectations rise at the same rate as capabilities.
Of course the day will come when RedRockets seem quaint — but we’re not there yet.
640KB of RAM was not enough for anyone. Computers are not fast enough — especially as higher resolutions or higher frame rates come into play.
Again, I’m not really trying to argue with you. I agree with most of your points. I’m just trying to point out that the laments about a more limited Mac Pro design would be real for some of us.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events -
Walter Soyka
October 29, 2012 at 11:17 pm
Also, I know a bunch of us here (at least Chris, Craig, and Herb) are in the New York area. I hope everyone and their families are safe and are weathering the hurricane well.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events -
Chris Kenny
October 29, 2012 at 11:50 pm
[Walter Soyka] "I am responding to the point you raised about what "a 'pro' Mac should do."
Major engineering changes to the Mac Pro may result in it becoming a less suitable choice in some production niches. There would be people who were well-served by the 2006, 2008, and 2010 Mac Pros at their respective launches who would be ill-served by a 2013 with limited expansion.”
I’m just not sure that’s really true, at least not to a large extent. For instance, you could stick multiple graphics cards in a 2008 Mac Pro at launch, but graphics cards are now so much faster than they were in 2008 that whatever you were doing with multiple cards then can probably be done with a single card now. Maybe there might be some hit over the immediately previous generation, but then again, maybe not with NVIDIA playing with ideas like sticking two GPUs on a single card (the GTX 690, not that there will be enough internal power connectors in any Mac Pro for one of those).
Another thing to consider is that in many instances you run out of physical slots long before you run out of bandwidth. Video I/O and more ‘mainstream’ RAID (i.e. RAID systems in the range of ~300-800 MB/s) are not all that bandwidth intensive by PCIe standards, yet the cards to do them still take up valuable slots. In this sort of scenario, trading off slots for daisy-chainable Thunderbolt devices is a big win.
So while a system with a couple of PCIe slots and four Thunderbolt ports isn’t the best possible Mac Pro Apple could build for every pro customer, I think it really would be better in absolute terms for the vast majority of their pro customers than any previous Mac Pro.
[Walter Soyka] "I do understand that I'm a niche within a niche, but I used to be all Mac, and now I don't feel that I could be anymore even if I wanted to be."
I'm not going to tell you otherwise, and I don't think Apple would either. The truth is, the Mac Pro, through a sort of inertia of just putting in the latest increasingly powerful parts and not really thinking about the scope of the product, was, I think, carrying Apple into rarefied high-end niche territory that the company hadn't traditionally played in. That was quite useful to some people, but I don't think anyone should be too shocked to see it 'corrected'.
Then again, maybe I’m crazy, and the move upmarket was completely deliberate, and Apple’s next Mac Pro is going to be the most beastly thing they’ve ever shipped, with a $5000 starting price.
[Walter Soyka] “Soyka’s Law: Expectations rise at the same rate as capabilities.”
I think expectations rise faster than a naive view might expect them to, and for a very long time they did rise at the same rate as capabilities, but they’ve started falling behind now. This is why the average sale price of personal computers has crashed over the last decade (Apple has resisted this pretty well, but largely through means unrelated to performance), it’s why people are now more willing to trade off performance for portability than they used to be, and it’s why people now use commodity systems for many professional tasks that they used to buy expensive specialty gear to perform.
—
Digital Workflow/Colorist, Nice Dissolve. You should follow me on Twitter here. Or read our blog.
-
Walter Soyka
October 30, 2012 at 12:15 am
[Walter Soyka] "There would be people who were well-served by the 2006, 2008, and 2010 Mac Pros at their respective launches who would be ill-served by a 2013 with limited expansion."
[Chris Kenny] “I’m just not sure that’s really true, at least not to a large extent. For instance, you could stick multiple graphics cards in a 2008 Mac Pro at launch, but graphics cards are now so much faster than they were in 2008 that whatever you were doing with multiple cards then can probably be done with a single card now. Maybe there might be some hit over the immediately previous generation, but then again, maybe not with NVIDIA playing with ideas like sticking two GPUs on a single card (the GTX 690, not that there will be enough internal power connectors in any Mac Pro for one of those).”
I was talking about the suitability of the machines at their respective launches — in other words, a 2008 Mac Pro in 2008. A 2008 Mac Pro today is easily outclassed by a modern laptop for most tasks.
[Chris Kenny] “So while a system with a couple of PCIe slots and four Thunderbolt ports isn’t the best possible Mac Pro Apple could build for every pro customer, I think it really would be better in absolute terms for the vast majority of their pro customers than any previous Mac Pro.”
I don’t presume to speak for the majority. I have no idea where they are — video? Audio? Science? What’s the average slot usage? I have no idea.
But it doesn’t have to be an either/or proposition. A Mac Pro with Thunderbolt AND moderate internal expansion is better than either a Mac Pro with Thunderbolt and 2 slots or a Mac Pro with no Thunderbolt and 7 slots.
There’s also the question of how to actually integrate discrete graphics cards with video-carrying Thunderbolt ports on the motherboard — we haven’t touched on that yet.
[Chris Kenny] “I’m not going to tell you otherwise, and I don’t think Apple would either. The truth is, the Mac Pro, though a sort of inertia of just putting in the latest increasingly powerful parts and not really thinking about the scope of the product, was, I think, carrying Apple into rarified high-end niche territory that the company hadn’t traditionally played in. That was quite useful to some people, but I don’t think anyone should be too shocked to see it ‘corrected’.”
Apple made some deliberate up-market moves: Shake, Xserve, and FCSvr come to mind. Their strategy looks a little different today.
[Chris Kenny] “I think expectations rise faster than a naive view might expect them to, and for a very long time they did rise at the same rate as capabilities, but they’ve started falling behind now. This is why the average sale price of personal computers has crashed over the last decade (Apple has resisted this pretty well, but largely through means unrelated to performance), it’s why people are now more willing to trade off performance for portability than they used to be, and it’s why people now use commodity systems for many professional tasks that they used to buy expensive specialty gear to perform.”
Expectations escalation is about much more than price or performance. Look at the client side: as soon as we can accomplish something new, better, or faster, it becomes expected that we will do it. Expectation escalation is not refuted by how commodity systems can now perform specialty tasks — expectation escalation explains how that came to be.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events -
Chris Kenny
October 30, 2012 at 2:09 am
[Walter Soyka] "There's also the question of how to actually integrate discrete graphics cards with video-carrying Thunderbolt ports on the motherboard — we haven't touched on that yet."
There’s a motherboard on the market already that I believe solves this problem just by having DisplayPort input on the motherboard, to which you simply run a DisplayPort cable from the graphics card. That’s a simple enough solution, and with a case designed with it in mind it wouldn’t have to be ugly.
Another possible option is to just include some halfway respectable on-board graphics and use that video signal on Thunderbolt. This would also give Apple the option to offer versions of the system without a discrete graphics card at all, which makes a lot of sense if there’s a new form factor that’s rack-mountable and is intended to also fill the gap left open by the Xserve’s cancelation. It would be nice for Resolve too, since while Resolve runs pretty well now with one GPU, it still likes two GPUs more; a moderately decent on-board GPU that could be used to drive the GUI would free up the ‘real’ GPU completely for CUDA processing without tying up a slot.
Of course doing things that way would mean you couldn’t use a Thunderbolt screen if you wanted it directly attached to your fastest graphics card, but the main appeal of Thunderbolt for screens is that you can hook up the screen and a bunch of peripherals plugged into it with a single cable; that’s cool for laptops, not so exciting for towers.
[Walter Soyka] “Expectations escalation is about much more than price or performance. Look at the client side: as soon as we can accomplish something new, better, or faster, it becomes expected that we will do it. “
But new things take a while to show up. Color grading used to be a 'heavy iron' task, now in most cases it doesn't need to be. It's not like everyone said 'Well, sure, we can now get 10 nodes in real time on commodity hardware, but we're going to stick with $200K specialty systems because now we want 40 nodes on every shot'.
Or look at the progression of standard resolutions. There's some minor pressure to move some productions to 4K finishing, but almost nobody seems to want to pay what it would cost, even though 4K now is probably cheaper than 2K was five or six years ago. You don't see crashing prices in a market where users are being under-served; in that market, they're willing to pay more for more capability. You see crashing prices in a market where users are being over-served.
There’s some interesting discussion of this here:
Disruption (low-end or otherwise) happens when a product over-shoots the market. It makes sense to compete on a new basis, be it low price or convenience or customization, if the prevailing basis of competition has led the prevailing products to be more than good enough. If you look through all the examples of low-end disruption, you’ll find that the incumbents were motivated to flee up-market and to continue to improve their products even though they exceeded the demands and expectations of mainstream buyers.
This article is about the iPad, but the above paragraph applies almost perfectly to the post production hardware/software market. The whole reason you’re seeing things like Mac versions of Resolve and Smoke in the first place is because the pricy turn-key versions were overshooting the market, and were vulnerable to low-end disruption.
And I suspect this is a one-way process. That is to say, I don't think users usually move back upmarket, even when new capabilities arise that they could exploit earlier by doing so. For instance, let's say someone perfects machine vision technology and designs some software that can analyze footage, understand the scene in full 3D, and let you virtually move lights around. That would be pretty cool, it's something we could imagine happening in the next decade or so, and it would require a lot more processing power than today's color grading software. But I predict that if, when this technology hits, it requires expensive specialty hardware, practically nobody who has gotten used to finishing projects on cheap commodity hardware will use it… until it's possible to do on cheap commodity hardware.
—
Digital Workflow/Colorist, Nice Dissolve. You should follow me on Twitter here. Or read our blog.
-
Walter Soyka
October 30, 2012 at 2:28 am
[Chris Kenny] "There's a motherboard on the market already that I believe solves this problem just by having DisplayPort input on the motherboard, to which you simply run a DisplayPort cable from the graphics card. That's a simple enough solution, and with a case designed with it in mind it wouldn't have to be ugly."
That’s cool!
[Chris Kenny] “But new things take a while to show up. Color grading used to be a ‘heavy iron’ task, now in most cases it doesn’t need to be. It’s not like everyone said ‘Well, sure, we can now get 10 nodes in real time on commodity hardware, but we’re going to stick with $200K specialty systems because now we want 40 nodes on every shot”.”
You are looking at high-end features trickling down, but you are not acknowledging any growth of the high-end.
It was only a few years ago that only the very high-end productions got any kind of color grading. As the capability of color grading slides down the market through more affordable tools, the expectation that mid-end and soon (if not now) low-end productions will be graded is rising to meet it.
Likewise, the capabilities in the high-end of color grading (convergence with vfx and compositing, evidenced in technologies and techniques like 3D geometry-based re-lighting) have become the high-end expectation.
Put another way, there’s no such thing as a free lunch. You, as a content producer, cannot solely capture the benefit (like cost savings or quality improvements) of a new capability. You must pass it on to your client. If you do not, efficient markets will punish you and reward your competitors who do pass on the benefit.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events -
Chris Kenny
October 30, 2012 at 3:00 am
[Walter Soyka] "Likewise, the capabilities in the high-end of color grading (convergence with vfx and compositing, evidenced in technologies and techniques like 3D geometry-based re-lighting) have become the high-end expectation."
I don't know that these things have become the high-end expectation, though. As far as I can tell they're still used on relatively few projects even at the high-end.
It might clarify things to look at image resolution, since it's more objective. A substantial fraction of Hollywood films are still finished at 2K, even though 4K is now undoubtedly cheaper than 2K was when DI started becoming standard. Why would this be so? I think it's fairly clear that it's because capability really is advancing faster than expectations. Put another way, given two systems with similar capabilities and performance, one that works in 4K and one that tops out at 2K, the former would be over-serving a large fraction of the potential market.
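To put rough numbers on the 2K-versus-4K comparison: assuming DCI container sizes and uncompressed 10-bit RGB DPX (which packs three channels into four bytes per pixel), the gap in pixels and playback bandwidth looks like this:

```python
# Comparing 2K and 4K finishing rasters (DCI sizes assumed).
K2 = (2048, 1080)                    # DCI 2K
K4 = (4096, 2160)                    # DCI 4K

pixels_2k = K2[0] * K2[1]
pixels_4k = K4[0] * K4[1]

# Uncompressed 10-bit RGB DPX: 3 channels packed into 4 bytes/pixel.
mb_per_frame_2k = pixels_2k * 4 / 1e6
mb_per_frame_4k = pixels_4k * 4 / 1e6

print(f"4K/2K pixel ratio: {pixels_4k / pixels_2k:.0f}x")
print(f"24 fps stream: {mb_per_frame_2k * 24:.0f} MB/s (2K) "
      f"vs {mb_per_frame_4k * 24:.0f} MB/s (4K)")
```

A 4K pipeline carries four times the pixels of a 2K one at every stage, so a system sized for one will be heavily over- or under-provisioned for the other.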
[Walter Soyka] “Put another way, there’s no such thing as a free lunch. You, as a content producer, cannot solely capture the benefit (like cost savings or quality improvements) of a new capability. You must pass it on to your client. If you do not, efficient markets will punish you and reward your competitors who do pass on the benefit.”
Well, ultimately this results in a larger quantity of cheaper, better content being available to consumers, but more innovation in distribution is probably required to make this materialize. Until then, I think various market participants are in fact managing to capture value they wouldn’t in a perfectly efficient market.
—
Digital Workflow/Colorist, Nice Dissolve. You should follow me on Twitter here. Or read our blog.
-
Walter Soyka
October 31, 2012 at 5:58 pm
I am not arguing that a technology is universally adopted the instant it becomes possible, and I am not arguing that expectations are always about a specific technical capability.
[Chris Kenny] "It might clarify things to look at image resolution, since it's more objective. A substantial fraction of Hollywood films are still finished at 2K, even though 4K is now undoubtedly cheaper than 2K was when DI started becoming standard. Why would this be so? I think it's fairly clear that it's because capability really is advancing faster than expectations. Put another way, given two systems with similar capabilities and performance, one that works in 4K and one that tops out at 2K, the former would be over-serving a large fraction of the potential market."
This argument mixes micro-level capabilities (some facilities are capable of 4K production) with macro-level expectations (the entire industry wants 4K production). When you compare capabilities and expectations at the same level, this disparity doesn’t exist.
This argument is also restricted to technical capabilities with no regard to pricing, which disconnects it from the real world. Capabilities and expectations are always expressed relative to price. Given infinite money to spend, you can have virtually any capability you want. With finite money to spend, your capabilities are limited, and expectations are similarly tempered.
Think of the project triangle: quality, speed, price. In the real world, consumers do not measure their expectations on a single dimension. They expect a certain quality at a certain price in a certain timetable. The good-enough principle is totally compatible with the idea that expectations rise at the same rate as capabilities. At some point, the quality will become good enough, and the capability growth and expectation escalation shift to price and schedule.
Let’s use your example of resolution. Think of SD and HD. As the industry became capable of HD production at ever-lower price points, the expectation of HD production increased. Today, there is practically no SD-only production.
Resolution is interesting to study, because resolution increases step-wise (we jump discontinuously from one raster size to another), whereas Moore’s Law ultimately suggests that computational power will increase exponentially. This means that as the time from any resolution increase progresses, computers will become drastically more powerful. In isolation, this would support your theory that capabilities are rising faster than expectations (computers get faster while resolution remains constant), but look at what actually happens industry-wide after resolution jumps: expectations grow, too, in the form of reduced budgets and compressed schedules.
If we can agree that expectations and capabilities both have rates of change, there are only three possibilities for the difference in those two rates at any given instant: expectations grow faster, capabilities grow faster, or they grow at the same rate.
What if expectations grew faster than capabilities? Expectations can’t reasonably grow faster than capabilities; without the capability, there would be no justification for the expectation.
What if capabilities grew faster than expectations? This would result in increasing surplus of some kind for the producer, ultimately leading to increasing income or decreasing production time. (There are not too many posts on these forums suggesting that people have larger budgets than they know what to do with or longer schedules than they need.) Customers would not see better, cheaper, or faster products, because the benefits would be captured upstream.
What if capabilities and expectations grew at the same rate? Producers and consumers, through a functioning market, could get better products cheaper, better products faster, or cheaper products faster. Basically, as something desirable becomes feasible to produce at a low-enough cost, the market demands it, and competition squeezes out excessive profit, passing the savings on to consumers.
Ever-better products at ever-lower prices? That is exactly what is happening today (whether you’re talking about the productions we work on for our clients, or the computers and technology we buy from hardware vendors, or any other functioning market). Expectations are rising at the same rate as capabilities.
Walter Soyka
Principal & Designer at Keen Live
Motion Graphics, Widescreen Events, Presentation Design, and Consulting
RenderBreak Blog – What I’m thinking when my workstation’s thinking
Creative Cow Forum Host: Live & Stage Events