Has Apple actually slowed the rate of technological progress over the last 3 years?
I ask because, as I look at my receipts over the last several decades ('80s, '90s, '00s), I notice one thing...
Over the last 3 years, the pursuit of "green", thin, small, portable, flash-based and interconnected Apple iPhone/iPad competition has actually slowed the pace of technological innovation overall...
We are still buying tiny 2TB 7200rpm drives after 3 years, instead of 15,000rpm 5TB drives? Hard drive warranties went from 5 years to 90 days on some models? Quad-CPU desktops are nowhere to be seen anymore, and more "green" low-power cores are preferred to the fun raw megahertz races? I have a 3-year-old Acer laptop with a Blu-ray player and an 18.4" 1080p screen that smokes upcoming 2012 Ultrabooks? I used to get UNLIMITED voice and data for $60, and NOW, in the US, we pay big bucks for low-speed wireless data with GB caps! $30 paid basic cable delivers standard-definition video, which is actually WORSE than free 1080 over-the-air signals! Thunderbolt over copper with few devices, instead of Light Peak over optical? The SD Association released the UHS-II standard a year ago, and nobody has shipped a single device!
Don't get me wrong, I appreciate Apple-inspired price drops on small SSDs and RAM, and social media, but that's about the only residual benefit as far as I can tell, and I think those would have come along anyway.
Am I the only guy who would prefer the glory days, when progress meant things got faster, bigger, and ate more power and space; when full-tower cases included power supplies, motherboards supported quad CPUs, tube monitors looked great and weighed a ton (my ViewSonic 17" tube hasn't quit since 1993, and I sold my 21" CAD monitor a decade ago), and 18.4" power laptops ate a battery in a single DVD showing?
Just a random thought for 2012, from my 3-year-old 1080p Blu-ray laptop bought from Meijers for under $1000...
PS. Apple haters need not bash; I am commenting on general technology progress that chases Apple's direction (and profit margin)... no dislike intended for Steve Jobs or Apple product users.
I think I wrote my opinion already http://www.personal-view.com/talks/discussion/1994/camera-rumblings
And mentioning Apple will turn it into flame. For certain.
One more addition.
Apple is really not the cause.
But indicator.
Elites are making consumers.
And are not interested in creators (hence the big shift in all the tools and media coverage). By creators I do not mean musicians, filmmakers, and painters. Quite the opposite.
It seems like you're lamenting that battery efficiency has advanced right along with processor technology. A lot of cool stuff is happening: tri-gate transistors, FPGAs, multi-wafer chips. Clock speeds haven't just doubled every year because clock speed is only one way of measuring how much work a processor can do, and heat, yield, and quantum tunneling are just a few of the problems you run into with smaller semiconductor processes. Just 25 years ago we were on a 1-micrometer process; now we're on, what, 22nm? 16nm? It's awesome!
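That process shrink alone is a staggering density gain. A rough back-of-the-envelope illustration (assuming ideal area scaling, which real chips only approximate):

```python
# Transistor density scales roughly with the inverse square of the
# feature size, assuming ideal area scaling (real processes fall short).
old_nm = 1000  # 1 micrometer process, ~25 years earlier
new_nm = 22    # 22 nm process, current as of 2012

density_gain = (old_nm / new_nm) ** 2
print(f"Ideal density gain: ~{density_gain:.0f}x")  # ~2066x
```

So even with clock speeds stalled, the same die area holds on the order of two thousand times more transistors than it did a generation ago.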
If you want things that are just 'bigger,' feel free to enjoy IPv6 and 64-bit memory architectures. How much of a pain was it to have files no larger than 4GB? How much of a pain is it still on FAT32 storage?
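To put that 4GB pain point in numbers: FAT32 records each file's size in a 32-bit field, so the hard ceiling is one byte under 4 GiB. A quick check:

```python
# FAT32 stores a file's size in a 32-bit unsigned field, so the largest
# representable file is 2**32 - 1 bytes, i.e. one byte short of 4 GiB.
max_file_bytes = 2**32 - 1
print(max_file_bytes)               # 4294967295
print(max_file_bytes / 2**30)       # just under 4.0 (GiB)

# A single-layer DVD image (~4.7 GB) already exceeds the limit:
dvd_bytes = 4_700_000_000
print(dvd_bytes > max_file_bytes)   # True
```

Which is exactly why a DVD rip or a long HD video clip won't fit on a FAT32-formatted card or USB stick in one piece.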
I don't think Apple had much to do with this, and you can happily buy 15k RPM drives, but who needs that kind of seek time? What people did want were larger hard drives, and larger they've gotten.
@csync If my timeline is correct, shouldn't we all be using 128-bit processors and OSes right now if we had just stayed in line with past development cycles?
Instead, in 2012 we'll get "touch" Windows 8 for touch cell phones/pad devices with ARM processors? That's the most backwards progress I've seen in 30 years of supposed technological advancement, all so that kids can sext each other in real time on thin cellphones? ...
;-)
I somehow have to agree... the progress is slow... I just remember that I had a high end IBM laptop in 2004 and got it replaced 4 years later with a new one... well.. maybe more powerful by 20%? ... not much...
Now it's already been 4 years again that I've had that one... I'll get a replacement again... but the new ones are faster by 10-15%?
I bought an i7 2600K almost a year ago... now their most powerful CPU is faster by... hmm... 10%? ...and that's because it has 2 more cores and is considered high-end, while the 2600K is mid-range... so where's the progress?
On the other hand, once you get an iPhone... it just works... you don't need to hassle with it every time... there are newer models out now... but I still have my old one, which can do 90% of what the new model does... so I don't feel I need to upgrade...
Anyway... the pace is slow... nothing extraordinary, nothing breathtaking is being developed...
These chips can go as fast as you want, but unless you're prepared to purchase another $1000 worth of cooling, you'd better be appreciative of the leaps and bounds they are making in squeezing performance out of a small power budget.
The biggest bottleneck in our systems is storage. However, even though the tech is there (SSD), the price isn't. Why invest billions in efficient consumer cooling for 15k drives when SSDs are on the brink of ubiquity, with price falls to follow?
I'm currently running a 27" IPS LCD with a resolution of 2560x1440 that only cost me £350. Now 'that' is progress!
There's no money in the old way of doing things. Everything has changed, including the way people look at technology. More and more it's not tech heads but regular people buying and using tech, and they aren't power users! They just use tech for social networking and everyday stuff that doesn't really require massive power. It was inevitable.
@NickBen @Aria I absolutely agree with what you're saying. Isn't it basically that most people using computers up until quite recently, were the pioneers and the creative ones, and now we have shifted to a mass-market where most people using this technology just want an "appliance" that they don't need to understand?
Not saying that's desirable, just how it is.
@Mark_the_Harp Yes, and that's what people started saying about the Internet around 10 years ago...
Social networking and the internet are just more mainstream now... and since the masses are a bunch of dumb people, computers seem to be making more progress in "usability" (or dumbing down) to appeal to more of the stupid-class (aka most everyone).
Just my 2cents... :)
Computers long ago surpassed the needs of the average user, who needs only word processing and internet access. Power users are in the minority. Let's remember that the original intent of the PC was for the common man to have easier use and access; that was the plan and purpose behind all the GUI innovations. Kids don't even care about computers per se. They just want a device that can quickly give them access to all the social networks, GPS, and taking photos and videos, then sharing them. Mobile CPUs can handle that without much effort.
I don't think progress is slow, it just has been focused on small devices that need long running times on battery. This is what the large market demands. Us "geeks/nerds" get the feeling of slow progress... Al
Don't have any particularly strong opinions/observations about this, but I never would've expected my 2008 8-core MacPro to still feel more or less top of the line in 2012. Not saying it's the fastest machine you can buy, but in the 90s it would've felt almost unusable at this point relative to changing software demands.
Intel's early 4-core CPUs used separate chips for each core, which allowed all four cores to run at max speed. Their current designs shrink the cores onto a single die, which reduces power dissipation and improves internal bus speed. However, Intel had to implement a governor on max core speed to prevent excessive power dissipation, so you can only run one or two cores at max speed. With a fully multithreaded application that uses all available cores, clock speed must be reduced to prevent overheating the chip.
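The tradeoff described above follows directly from how dynamic power scales with voltage and frequency (roughly P proportional to C·V²·f, so close to the cube of the clock when voltage rises with frequency). A toy model with illustrative numbers (the TDP, base power, and clocks below are assumptions, not specs of any real Intel part):

```python
# Toy model of why an all-core clock must be lower than a single-core
# turbo clock. Dynamic power scales roughly as C * V^2 * f; if voltage
# rises linearly with frequency, per-core power grows with f cubed.
# All numbers are illustrative assumptions, not real Intel specs.

def package_power(active_cores, freq_ghz, base_freq=2.0, base_power_w=15.0):
    """Estimated package power in watts for the given core count and clock."""
    return active_cores * base_power_w * (freq_ghz / base_freq) ** 3

TDP_W = 95.0  # hypothetical thermal limit for the whole package

print(package_power(1, 3.6))  # ~87.5 W: one core can turbo within budget
print(package_power(4, 3.6))  # ~349.9 W: four cores at that clock cannot
print(package_power(4, 2.3))  # ~91.3 W: so the all-core clock is capped lower
```

Under these assumptions, the governor isn't being stingy: four cores at the single-core turbo clock would draw several times the thermal budget, so the chip trades clock speed for core count.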
A small article on the subject:
Much commentary about the American economy nowadays leaves the impression that economists should fix its problems. But Washington is teeming with smart economists, and the problems remain.
An economy is like a cloud: only when inside does one realize how diffuse it is – and that what matters are the particles of vapor that it comprises.
Likewise, an economy is an accumulation of transactions involving goods and services, mostly carried out by business enterprises. Their behaviors are what matters, and they cannot be adequately perceived from the distant perspective of economic models and statistics, but only on the ground – where an economy is built, where it breaks, and where it must be fixed.
On the ground, there are two kinds of enterprises: those that rely on exploration, and those that rely on exploitation. Every economy has both, but a healthy one favors the explorers. This fosters the sense of enterprise that made the United States such an economic powerhouse. Unfortunately, the American economy now favors the exploiters.
Economic development proceeds through a cycle that begins with young, exploring enterprises introducing new products, services, and processes. Over time, however, as they succeed, many explorers become exploiters. They saturate their markets, run out of new ideas, and get lazy. They then extend their product lines instead of developing new products; cut costs by putting pressure on their workers; lobby governments for favorable treatment; merge with competitors to reduce competition; and manipulate customers to squeeze out every last penny.
This, of course, makes these enterprises vulnerable to the creative challenges of the next wave of explorers – the fast new firms that confront the fat old corporations – and the cycle of destruction and reconstruction begins anew.
Contrast this with the America of bailouts, where the fat are considered “too big to fail.” In fact, many are too big – or at least too mismanaged – to succeed. How else to explain why major banks and insurance companies bet their futures on mortgages that a little investigation would have shown to be junk? Their senior managers either didn’t know, or cynically thought that they could get away with it, while the rest of their managers either didn’t care, or couldn’t get through to their bosses.
This American problem goes far beyond the bailouts. For every Apple and Google – explorers par excellence – count the energy companies with their cozy tax deals, the defense contractors that live off government budgets, and the pharmaceutical companies that buy their innovations and price what the market will bear, thanks to patents that governments grant, but without policing their holders.
On top of this, many US startups now leap into exploitation. Whereas America’s entrepreneurs had traditionally been inclined to create sustainable legacies, now many of them strive for an early IPO that will let them cash out quickly. This can be terribly dysfunctional, cutting off much of what still must be learned.
When economists boast about America’s great productivity, what they have in mind is exploration – finding ways to do things better, especially through superior processes. But much of this “productivity” has in fact been destructively exploitative. Think of all the corporations that have fired great numbers of people at the drop of a share price, leaving behind underpaid, overworked employees and burned-out managers, while the CEOs escape with their bonuses.
To see where this leads, imagine a company that fires all of its workers and then ships its orders from stock. Economic statistics would record this as highly productive – until, of course, the company runs out of stock. American enterprise is running out of stock.
Seen in this way, there is no quick fix for America’s current economic problems. Firing workers or even printing money can be easy; changing dysfunctional behaviors is not. The US economy will have to be fixed by its enterprises, one by one, on the ground. Attitudes will have to change, and this will demand great dedication and patience – traits that seem to be in short supply in the US today.
The place to start is America’s executive suites, which should be cleared of mercenaries in order to encourage real leadership. That is the easy part: get rid of the obscene compensation packages and watch the mercenaries disappear. People who care about building and sustaining decent enterprises – and who understand that doing so is a team exercise – can then take over.
Successful enterprises take time to create – time spent on inventing better products, serving customers more effectively, and supporting workers in ways that enhance their commitment. Symbols matter, too: the term “human resources,” for example, should be retired, because a great enterprise is a community of engaged human beings, not a collection of detached capital.
Public support should be shifted from protecting large established corporations to encouraging the growth of newer enterprises. And startups should be discouraged from rushing into the embrace of the stock market’s short-sighted analysts (and many an established corporation should be encouraged to escape that embrace). At the same time, regulation and taxation should be used to rein in disruptive day trading and other exploitative speculation that crowds out sustainable investment and disrupts regular business activities.
Above all, what the American economy needs now are managers who know and care about their businesses. Armies of MBAs who have been trained to manage everything in general but nothing in particular are part of the problem, not the solution. So are economists who study clouds without ever getting wet.
Via: http://www.project-syndicate.org/commentary/mintzberg3/English
"Economic development proceeds through a cycle that begins with young, exploring enterprises introducing new products, services, and processes. Over time, however, as they succeed, many explorers become exploiters. They saturate their markets, run out of new ideas, and get lazy."
Yep. This is the transition from capitalism to collectivism...
Also, the SBA (Small Business Administration), which is supposed to help small businesses, does almost NOTHING. No one is helping startups anymore. If you're not talking big dollars, they don't even want to waste any time on you. There used to be more small banks willing to take a chance, but not anymore.
Yep. This is the transition from capitalism to collectivism...
You see this transition in almost anything. I am sure it is also present in the freezing water :-)
@bwhitz They're both phases of capitalism - hierarchical corporate structures minimize any "collectivist" benefit to the exploitation phase (other than among shareholders). I think the terms you're looking for are "entrepreneurial" versus "profiteering".