Future of Programming

READY.
That is the prompt the Commodore 64 gives once it has been turned on; immediately, it is there to be programmed. Most often the simple instruction to LOAD something was typed, bypassing this environment to play games or load other software, yet there was that moment of exposure, that spark of curiosity that I and many others found in getting into programming. We could tell the computer to do stuff.
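
For anyone who never saw that screen, a minimal sketch of the moment: the classic two-line BASIC listing of the era (the exact program is illustrative, any short listing would do), typed straight at the READY. prompt and set going with RUN.

 10 PRINT "HELLO"   : REM print a line of text
 20 GOTO 10         : REM jump back to line 10, forever
 RUN

Two lines, and the machine obeys until you press RUN/STOP. That immediacy is the exposure the rest of this piece is about.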

Computing environments are very different now, and that is mostly a good thing. From a usability perspective, no longer does anyone need to type in a seemingly archaic instruction to load a program or file off a disk; it’s all delivered through an intuitive mouse pointer and windows. This has been further abstracted with the iPhone, iPad and other touch-screen mobile computers: no longer do we need to be aware of the technical underpinnings of how our computers work while we’re using them. Touch the icon and the app we want launches, with the data (that is, photos, status updates, movies) we want to work with.

But I wonder if something has been lost: that initial magical exposure. Computers no longer boot up into a programming language, and coding tools aren’t typically installed. David Brin talks about the simple ‘type it in’ programs in maths books (and magazines, as I remember) that can’t be typed in anymore, a frustration in teaching his son the fundamentals of programming languages.[1] While Apple has taken care to curate a developer ecosystem, it is completely unexposed to non-developers (while you can get the SDK, you can’t develop on your own iOS device without paying Apple or jailbreaking). Even Android is not immune to this lock-down: the largest (device-based) mobile computing platform will be made largely of free software, but we won’t be able to modify it due to locked-down hardware.[2]

Overall this may provide a better user experience, in terms of being able to use a device that just works (even geeks want to go to Disneyland, as ''Hackers and Painters'' puts it). But it loses an exposure point, particularly when these sorts of devices become people’s primary computers. With no immediate access to a computer that can be programmed, how do we encourage the recognition that ''your'' app can be there too?

I think the web has always had a much stronger emphasis on freedom than device platforms have. The barrier to entry is low: provided you’re willing to learn, you can crack open a text editor, write your content and upload it to a web server, and it becomes available on a multitude of devices. There definitely seem to be stronger online communities around web programming technologies. Mash-ups and other services still expose some of the underpinnings; even text forms on the web will often allow and expose the underlying HTML, Facebook has a custom HTML application, and Twitter accepts colour codes for custom profiles.
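
As a minimal sketch of that loop (the filename and wording here are illustrative, nothing more): save something like the following as index.html, open it in any browser, and ‘view source’ shows back exactly what was typed.

 <!DOCTYPE html>
 <html>
   <head>
     <title>My first page</title>
   </head>
   <body>
     <!-- everything here is visible to anyone via 'view source' -->
     <h1>Hello, web</h1>
     <p>I made the computer do something.</p>
   </body>
 </html>

No SDK, no payment, no jailbreaking; the same file works on a desktop browser and a locked-down phone alike.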

Exposing this curiosity is key to encouraging the next generation of programmers. While the machinery will continue to be abstracted away and hidden, it is essential that the underpinnings remain available and accessible. Provided ‘view source’ still exists in a web browser, that magical moment of making the computer do something will continue to provide a spark.

The future of programming is the future of the web.
1. David Brin - Why Johnny can’t Code.
http://www.salon.com/technology/feature/2006/09/14/basic

2. Tony Mobily - 10 years on: free software wins, but you have nowhere to install it.
http://www.freesoftwaremagazine.com/columns/10_years_free_software_wins_you_have_nowhere_install_it
