Developers face more choices in platforms today than ever before. There are social networks like Facebook, MySpace, LinkedIn, and OpenSocial. There is the iPhone SDK, Google’s Android, and RIM's BlackBerry. There are communications tools like Twitter, Jaiku, and Pownce.
There is the choice of development languages, and associated tools. Do you choose Java? Ruby on Rails? Python? PHP? Flash? Django? .NET?
How about cloud computing infrastructures? Amazon? Google AppEngine? Do you risk some unknown startup?
And the computing platform? Windows, Mac, Linux, or some combination? Or web only?
So many choices. How does a developer decide?
The easiest way is to follow the money. That certainly drove development on the Windows platform twenty years ago. But what if there is no obvious money?
In a world of free products, you’ll need to choose the path to the greatest reward. That usually means the path that gets your product or service in front of the greatest number of people, as quickly as possible. For multi-platform support, I will choose Java.
A year ago when the Facebook Platform was announced I could see that Facebook was growing exponentially. I could see the potential for getting web-based applications in front of groups quickly, and virally, so it was a simple decision to learn that platform. That decision in turn drove me to learn PHP and MySQL, since the platform provided a PHP API.
Today I'm learning Ruby on Rails, as I’ve seen how quickly it can be used to build a web application. I’m also keenly aware, from Twitter's experience, that it may not be the most scalable choice. But I'm not building anything that big yet, and time-to-market is of greater concern.
And like many others, I have the iPhone SDK sitting on my laptop, though so far it is untouched. I’m in India, which doesn’t have the iPhone currently, but I have an iPod Touch, and I can already see that this is going to be an important platform. So the iPhone is next.
Back before the days of the Windows platform, it was like the Wild West. Every application looked different, and acted differently as well. Windows standardized everything so you only had to create one product, and all of the products on the platform behaved similarly.
Today we have a much greater choice of tools to use, but we’re back on the frontier again: a different platform for every tool, and precious little standardization, except for efforts by Google such as Android and OpenSocial. And a new platform is announced virtually every week.
Given a finite limit of development resources, maybe we should stop picking platforms and start pushing for some standardization. Is it really necessary to have so many different ways of working with so many products that, at their core, do much the same thing?