Tuesday, 21 December 2010

Once upon a time, there was KDE

KDE: your UNIX easy

My first Desktop Environment on GNU/Linux was KDE 2, in a university laboratory. At the time I didn't have a computer powerful enough to dual-boot Windows 95 and a distro, but I read a lot, fascinated by the FOSS philosophy.
I was a KDE fanboy: I am still convinced that a project as big as a desktop environment must be written in an object-oriented language. KDE uses C++, while Gnome is written in C. The difference was visible in the early 2000s: KDE was fast, fascinating and colorful, while Gnome and its applications looked gray and dated.
When KDE 3 was released I ran it on my new Pentium IV 1200 with 512 MB of RAM, admiring an environment far ahead of Windows XP. Many friends of mine were surprised by KDE 3, its themes and its applications (Kopete, Konqueror, Kirc and KOffice), which brought it very close to Mac OS X.
In 2002, the Liquid theme (made by Mosfet) was a wonder.

KDE 4 Revolution

Since its foundation by Matthias Ettrich, KDE has been a well-engineered project. Ettrich ran many analyses to optimize the user experience, especially regarding memory use.
KDE 4 was a "revolutionary" project: its developers wanted to change the usual "desktop paradigm" by introducing an engine that runs many plugins (called "plasmoids"). You can still have a "desktop" with your folders: it's a plasmoid that shows your $HOME/Desktop folder scaled to your monitor.
KDE 4's underlying platform is a programming masterpiece: the base libraries (Phonon, Solid, KIO, Plasma, KParts and others) are well organized and well integrated with each other, but... take a look at these two screenshots





KDE 4 vs Enlightenment 17

Well... something is not very clear to me: Enlightenment 17 is a "not yet finished" project, but you can already test it. It's written in C and it's very fast and light. So light that it has convinced some makers of embedded devices to run E17 on their products.
KDE 4 is big and heavy, and it does the same things E17 does.
Looking at Gnome, I see a lighter DE, fast and nice-looking. Its technology is not as refined as KDE's, but Gnome does its job very well. It's usable and I can be productive with it. One year ago I tried both KDE 4 and Gnome: after some "Wow! Amazing!" I settled on Gnome because I could "do things", while KDE 4 seemed to me a "useless videogame".

The Future

Gnome is the most widespread DE thanks to its usability. Its Human Interface Guidelines are the secret of its success. KDE, instead, worked too much on its underlying technology, making it «the Java of Desktop Environments»: well designed, well documented, well thought out, not very usable.
KDE has to rethink its structure, putting the user experience at the center of its universe, continuing to host great applications and (maybe) going on a diet.

Monday, 20 December 2010

The answer to a Non-Rigid-Organization

Japanese philosophy is so close to computer science

Do you remember? Some days ago I suggested some ways to do data entry, but I didn't explain well when a wiki is a good solution. I want to illustrate this concept better and show you some cases where a wiki is a comfortable solution.

Wabi-Sabi

What's Wabi-Sabi? On Wikipedia we read:

Wabi-sabi (侘寂) represents a comprehensive Japanese world view or aesthetic centered on the acceptance of transience. The aesthetic is sometimes described as one of beauty that is "imperfect, impermanent and incomplete". It is a concept derived from the Buddhist assertion of the Three marks of existence (三法印, sanbōin), specifically impermanence (無常, mujō).
Characteristics of the wabi-sabi aesthetic include asymmetry, asperity, simplicity, modesty, intimacy and the suggestion of natural processes.

Does this sound familiar to you? Concepts like "transience", "impermanence" and "incompleteness" appear very often in a badly organized office, and they are a programmer's hell. Suppose we write a relational database application. It's well designed and has good CRUD interfaces. Now imagine our boss asks us to change the relational schema. Adding or removing a column may not be very difficult, but changing big pieces of the schema means rewriting 40% to 70% of our application's controller and writing scripts to move data from the old schema to the new one. Pretty boring and frustrating. And, more important, probably not definitive, because in a badly organized office changes can happen very often.

Wiki is Wabi-Sabi

A wiki is "fluid". It has a very simple schema (to our eyes), mostly based on "Articles" and "Categories". It checks whether a given page exists (blue links) or not (red links); it lists all uncategorized pages; it lists all unused categories, etc. It also includes a "Search" which shows us all pages containing the searched word.
I chose a wiki for a particular job: I had a web application and I had to list all the entities shown to the user. I started with an Excel spreadsheet, recording where each entity was located (in which web page), whether it was read or write, who could modify it, and other fields. The organization changed some days ago, forcing me to change the schema. This time I moved all the data to a wiki (MediaWiki), making a page (article) for every entity.
Result: it's appreciated, and a change will be managed easily, because there are no SQL schemas to modify, just pages.
If you're wasting your life on an Access database and your boss wants a "small change" every day that drives you mad, then moving to a wiki is probably the best solution.

Tuesday, 14 December 2010

The Apache that left the meeting

Apache has flown away
Mix (a friend of mine) and I have very different opinions about Java: he totally dislikes it, while I think it's good on servers.
Two days ago, Apache left the Java Community Process, because it disagrees with Oracle's decision not to release for free the tools needed to check a Java implementation's compatibility. This is an ugly decision, because it will make it very difficult (or impossible?) to check whether (e.g.) Apache Harmony is a compliant Java implementation.
Oracle is handling Java like a colony, ignoring what the Java community has done over the years. Oracle is pushing to morph Java into a "personal language".

When Mix sent me this news, I threw my hands in the air saying «well, we will program in Python or PHP», and I was serious. I have used Java in a few projects over the years and none of them ended successfully. In the low-to-medium market, where I work and where at least 70% of programmers work, Java is never used because:

  1. it's more difficult to write a JSP than a PHP page
  2. it's more difficult to use Hibernate than Ruby on Rails
  3. Swing is slower than Qt, wxPython or Tkinter
  4. it's harder to use XML as a configuration file than a Python/Ruby/PHP/Lua dictionary/hash

Many big players use Java for their work and they have many powerful tools, such as JUnit, Log4j, Ant, Geronimo, Batik, Cayenne, Cocoon and many more, developed by the Apache Foundation.

I am tired of talking about Java: Oracle, Apache, IBM and Google are playing with it in a very boring way. C/C++ never had all of Java's troubles because the language and the standard library were free to be developed by everybody. Java was once strictly managed by Sun and now by Oracle. This decision cuts off the freedom Java needs to become a "language of choice". People still use C/C++ because they're faster; they use Ruby or Python because they're easier; and because people aren't interested in portability.

Maybe Java, as we know it, is going to die. Maybe we'll see two branches: an "official" one made for servers (supported by Oracle, IBM and Red Hat) and a "mobile" one, supported by Apache and Google for the Android platform.
The future is unknown. Until the next "big" revolution, let's script.

Friday, 10 December 2010

Never do Data-Entry

«I am sorry I didn't write these days. Unluckily, I was very busy with a lot of work, so I couldn't pay much attention to other tutorials. Excuse me. I'll try to write something next week»
It happened to me yesterday...

Data Entry is bad by design.
It's true, it's important, because without data a database is useless. But there are good and bad ways to insert data. Good ways are:

  1. User-Based: users insert information through a specifically designed application
  2. Data-Transfer: data is moved from a data source (text file, Excel, network, etc.) to a destination database with a script
  3. User-Evolved: similar to user-based, but more Zen, because in this model we admit that «perfection is impossible, imperfection is normal, evolution is required». Accepting that we can't insert all the data ourselves, we delegate to users the task of modifying and refining it. It's how Wikipedia works
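The Data-Transfer approach above can be sketched in a few lines of Java: a script reads rows from a source (here a couple of inline CSV rows; the table and column names are invented for the example) and emits the SQL INSERT statements for the destination database.

```java
import java.util.List;
import java.util.StringJoiner;

public class DataTransfer {

    // Build an INSERT statement for one CSV row.
    // Table and column names are purely illustrative.
    static String toInsert(String table, String[] columns, String[] values) {
        StringJoiner cols = new StringJoiner(", ");
        StringJoiner vals = new StringJoiner(", ");
        for (String c : columns) cols.add(c);
        for (String v : values) vals.add("'" + v.replace("'", "''") + "'"); // naive quoting
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + vals + ");";
    }

    public static void main(String[] args) {
        String[] columns = {"name", "city"};
        List<String> csvRows = List.of("Alice,Rome", "Bob,Milan");
        for (String row : csvRows) {
            System.out.println(toInsert("people", columns, row.split(",")));
        }
    }
}
```

A real script would read the rows from a file and send the statements through JDBC, but the shape is the same: one dumb loop, and the data lands in a proper database instead of a shared spreadsheet.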

Bad ways are (usually) more widespread, because data entry is done to fill an information hole without considering how the database will evolve, who will use it and how.

  1. One-Man-One-App: this is the "least worst" solution. A programmer is charged with implementing a program to manage a database and with filling the database himself. Though still a bad solution, it's a bit more humane, because the programmer can write a program suited to HIS needs. More important, a single error doesn't compromise the whole work
  2. Data-Source-Based: absolutely THE worst solution. Two or more users work to fill a common data source (an Excel file, a text file, etc.)

One-Man-One-App problems are:

  1. A new "data-inserter" must be trained to learn how the highly customized application works
  2. Moving the data to another, better-designed database could be difficult
  3. The official data-inserter doesn't always pay the necessary attention to data entry, because he's a programmer. Repetitive jobs make programmers angry and frustrated

Data-Source-Based is bad because:
  1. Data entry manifests itself in its worst form: "filling" an Excel spreadsheet is one of the most boring and frustrating activities on earth. Even a 100x6 table is hard to fill with care
  2. The "Save button" is the incarnation of the "Cumulate-and-Fire" anti-pattern. If someone forgets to save, hours of work can be trashed
  3. Finding errors can be hard if you have more than 7 columns
  4. A clumsy operation (e.g. a "sort and save" on just a selection) can make the data inconsistent and irremediably ruin all the work done

If you're ALONE in filling a large database with data that comes from paper or other non-scriptable sources, try to move yourself to a good way of doing data entry. If that's impossible, NEVER USE the data-source-based way: create a front-end even if you're working with Excel. This will allow you to avoid the "cumulate-and-fire" anti-pattern, will give you a more comfortable way to insert data, and will give you a little fun while coding your script.
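As a minimal sketch of such a front-end (the file name and the two-field record layout are invented for the example), a tiny console loop can append every record to a CSV file the moment it is entered, so there is nothing to "cumulate" and no Save button to forget:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Scanner;

public class EntryFrontEnd {

    // Append one record immediately: each record is persisted on its own,
    // so a crash or a forgotten "Save" can't trash hours of work.
    static void appendRecord(Path csv, String name, String city) throws IOException {
        String line = name + "," + city + System.lineSeparator();
        Files.writeString(csv, line,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    public static void main(String[] args) throws IOException {
        Path csv = Path.of("entries.csv"); // invented file name
        Scanner in = new Scanner(System.in);
        System.out.println("Enter 'name city' pairs, empty line to quit:");
        while (in.hasNextLine()) {
            String line = in.nextLine().trim();
            if (line.isEmpty()) break;
            String[] parts = line.split("\\s+", 2);
            appendRecord(csv, parts[0], parts.length > 1 ? parts[1] : "");
            System.out.println("saved");
        }
    }
}
```

Twenty lines of glue, and the spreadsheet becomes just an export target instead of the thing people type into.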

Anyway, data entry is bad and boring. The only good way to do it is spreading it among three or four employees over a long timeline. 10 minutes of data entry per day is bearable; two or more hours per day is not.

Friday, 3 December 2010

iPhone Tutorials - Start with Objective-C


If you want to program on the Macintosh or on the iPhone, first you have to learn Objective-C. Objective-C is very different from C++ or Java, because it follows the "other" way of doing Object Oriented Programming: Message Passing.
With message passing you can easily implement some design patterns, such as the delegate, but you'll see that when I talk about UITableViews. First, you'll see how message passing is more expressive and readable than C/C++/Java methods.
Now, let's imagine implementing a simple (and classic) ComplexNumber class. First, in a file called ComplexNumber.h, we'll write the class interface.

//ComplexNumber.h
@interface ComplexNumber: NSObject{
    double real;
    double imaginary;
}

//---constructor
-(id)init;

@property(nonatomic)  double real;
@property(nonatomic)  double imaginary;

//---Add
-(void)Add:(ComplexNumber*)other;

@end

Quite easy, right? The class members are defined between braces and the methods (or messages) follow immediately after. The class interface is enclosed between the @interface and @end keywords. The id type is a pointer to an unspecified object, something like a "pointer to anything". It's useful especially in the delegate design pattern.

@properties are "syntactic sugar". They help you easily implement getters and setters for members. With this trick we'll be able to write

complex.real=3.0f;

The attributes inside the @property specify how the variable is handled (e.g. by reference or by copy). Because real and imaginary are primitive types, we don't have to specify one.

A method is declared according to this protocol:

-(type_returned) MethodName:(parameter_type)parameter_name;

So, a method called "HelloWorld" which returns nothing (void) and which accepts an NSString* (an OpenStep string) will look like

-(void)HelloWorld:(NSString*)string;

Now, let's create another file called ComplexNumber.m and implement all the methods:

#import "ComplexNumber.h"

@implementation ComplexNumber

//--- implementing properties
@synthesize real;
@synthesize imaginary;

/**
 * Constructor
 * a constructor always calls the superclass
 * constructor first and returns the object itself.
 * Between [super init] and "return self" we set
 * the values of all the object's members.
 */
-(id)init{
    self=[super init];
    real=0.0f;
    imaginary=0.0f;
    return self;
}


/**
 * Add
 * add param other to this ComplexNumber
 */
-(void)Add:(ComplexNumber*)other{
   real+=other.real;
   imaginary+=other.imaginary;
}
@end

Now our class is done. Let's try it in a main.m.

#import <Foundation/Foundation.h>
#import "ComplexNumber.h"

int main(int argc, char** argv){
    ComplexNumber* one=[[ComplexNumber alloc] init];
    ComplexNumber* two=[[ComplexNumber alloc] init];

    one.real=3.2f;
    one.imaginary=1.8f;

    two.real=4.0f;
    two.imaginary=2.2f;

    [one Add:two];

    // NSLog is the OpenStep equivalent
    // of printf. You can declare a NSString* without using
    // a constructor, using the syntax:
    // NSString* s=@"Hello world";
    NSLog(@"Now one is %.1f+i%.1f",
                                one.real,
                                one.imaginary);

    [one release];
    [two release];

    return 0;
}

This is our first Objective-C program: a simple ComplexNumber class. It has many shortcomings: it lacks a parametrized constructor (something like c=new Complex(real,imaginary) in C++ or Java), a toString method, and a method to set real and imaginary in one shot. I'll show you how to improve this class in the next tutorial.

Wednesday, 1 December 2010

Tutorials and Protests

Just some random thoughts
About Tutorials
Some months ago, I was surprised by the «Tech-Blog-Silence». Many blogs about GNU/Linux and FOSS stopped writing, saying «there's nothing to say». Obviously, if you talk just about the GNU/Linux desktop there's not always a revolution to talk about. How many years did we wait to pass from X11 to X.org? And how many to pass to Wayland? In this blog I'll talk about programming, computer science, sometimes about math and often about technological trends.
I would like to write some tutorials about Java and iOS programming. Maybe also about C++/Qt, pure C, Python and Lua. If you have preferences, please tell me.
Obviously, I'll continue to write my annoying rants :)

Meanwhile, in Italy: Demonstrations and Prime Minister
In Italy, protests against the new university law (Decreto Gelmini) continue, mobilizing students and professors. The Italian Prime Minister said "True students are at home, studying". I disagree with this statement: in a democratic country everybody can express his ideas and, even if there are always some extremist factions, it's unfair to despise ALL the demonstrators.

Meanwhile, in Italy: Wikileaks and Prime Minister
All governments are trying to control the recent WikiLeaks flood, hunting for Julian Assange and investigating to find the chatterbox. In Italy, the Prime Minister says that all the leaks about him are just lies. He says the government works well and that this infamous information was provided by paid girls, fourth-category politicians or communist newspapers.
I didn't know that being vice-ambassador of the USA in Italy is a «fourth-category» office, nor that «The Guardian», «The Economist» and «The New York Times» are pro-communism.
Maybe in Italy we are detached from reality.

Tuesday, 30 November 2010

Universities: Passing the Torch of Culture

«In this post I expose some thoughts about Italian universities and Italian politics. I hope not to be offensive; it's not my intention. I hope to stimulate a constructive dialog, giving my personal ideas and experiences as "experimental data"»
Who wants to sit on my chair tomorrow?

These days Italian students are protesting against a new law on universities. Googling a bit you can find some commentaries on the law (they're more understandable than the law's text), which can help you grasp its most important features:

  1. universities could become foundations
  2. professors will resign when they turn 70
  3. new professors will be recruited by a commission of 4 randomly chosen professors
  4. small universities will be encouraged to merge

It doesn't sound bad. But there are several cuts to university budgets, explicable only as a decision to reduce public financing. The "fight against the barons" (a nickname for the most influential professors, who manage recruiting, budgets, etc.) will not be won this way.
I appreciate these protests, because they prove young people's interest in public life and public education. But they should focus on two more important points, instead of just defending the status quo.

  1. Professors should be subject to censure by the University Senate and could be fired
  2. New professors should be admitted to a recruiting contest only if they are international figures (such as Andrew Tanenbaum, Dario Fo, Carlo Rubbia, etc.) or have a successful 10-year track record in a company

Bruce Sterling wrote in his «The Hacker Crackdown» what he thinks universities' mission is. Have you ever thought about it? What is the universities' mission? Sterling says it's «passing the torch of culture». It's true, but not complete. Universities must:

  1. Prepare future managers
  2. Prepare future high-level technical staff
  3. Pass the torch of culture to future generations

In our universities the computer-science program isn't very up to date: a new bachelor in CS usually knows just Java and C/C++/PHP; knows just a bit about UNIX; doesn't know exactly what a TCP/IP port is; has difficulty understanding the difference between a process and a thread; doesn't know how Ant, Make and other useful tools work; doesn't know anything about real-time systems and programming; doesn't know anything about NoSQL databases, sometimes not even that they exist.

This is unacceptable: it's like an electronic engineer who ignores microprocessors. These new bachelors also have difficulty learning new languages and techniques.
There are many things to change in Italian universities.
I hope this protest will not stop when the money returns.

Friday, 26 November 2010

Sony snaps GNUStep and Objective-C

Again, I want to apologize for not writing any news yesterday. Unluckily, there were some hard tasks to complete at work.

Are you serious?!
Great news from Sony: its new project, SNAP, has been launched and it will be based on the GNUstep framework. GNUstep is the free reimplementation of NeXTSTEP/OpenStep, a set of classes written in Objective-C back in the '90s by NeXT (Jobs's company, later absorbed by Apple).
Why am I so excited? Because Objective-C is a great language (I like it much more than C/C++); OpenStep is a great framework (take a look at NSString, NSArray and NSMutableArray: you'll like them); and because OpenStep is Cocoa's father... and Cocoa is the base of Mac OS X and iOS.

In SNAP's manifesto I read that the project aims to «modernize the framework and optimize it to target modern consumer electronic (CE) devices». But Sony also says

«These modern conveniences include such features as touch displays and 3D graphics»

This could mean Sony will allow Objective-C/GNUstep on their consoles too, making it easy to port a game from iPhone/iPad/Mac to PS3/PSP. Because, remember, both the PS3 and iOS use OpenGL ES as their graphics library. It's an interesting prospect.
What will be the benefits of using Objective-C + GNUStep instead of C++?

  1. GNUstep is a wider standard library than C++'s
  2. Objective-C supports reflection
  3. Objective-C has a more elegant syntax
  4. Objective-C 2.0 has a garbage collector... if you want it and where you want it ;)

Obviously, there are also drawbacks:

  1. Objective-C is slower than C++ (just a bit, but it's slower)
  2. Many libraries are written in C++
  3. C++ has a better known syntax

Anyway, it's another important step: Objective-C has a completely different way of thinking about OOP (message passing, reflection), more advanced than C++'s. GNUstep is a big framework. I would be surprised if the Big Next Step (bad pun ;)) in programming were made by the "old" Objective-C rather than Java, C# or something else.

Wednesday, 24 November 2010

Meanwhile, in Microsoft Russia...

In USSR, Windows installs you

Nikolai Pryanishnikov, president of Microsoft Russia, said:

«We must bear in mind that Linux is not a Russian OS and, moreover, is at the end of its life cycle»

I am a bit disappointed by this statement: I always believed Microsoft managers were smart enough to avoid such hazardous declarations. Saying that «Linux is dying» is a manifestation of ignorance.
Actually, GNU/Linux is a player to be reckoned with in important environments such as

  • Embedded devices
  • Servers
  • Mobile phones with Android

I suppose Mr. Pryanishnikov was talking about the desktop market, but he must understand that HIS statement is offensive to HIS company.
As president of Microsoft Russia, Nikolai Pryanishnikov isn't free to say anything he wants: he has to think carefully about what he can say and what he can't. A misunderstanding in the business world can move billions of dollars. A good CEO would take the situation into his own hands to correct this clumsy sentence.
Mr. Pryanishnikov fails to cite any sources and, more important, to say anything about the Google OS. It's true that it hasn't been released yet and that its base (Google Chrome) holds just 8% of the browser market (source: Wikipedia). But wait: Android is at 25%, while Windows Phone is at 3% (source: Wikipedia), and the mobile market is becoming more important than ever. If Mr. Pryanishnikov were a smart president, he would know that HIS Chief Software Architect (Ray Ozzie) sent a memo less than a month ago about the new market trends (mobile, cloud computing, etc.). Ignoring what your Chief Software Architect says is clearly self-defeating.
If GNU/Linux zealots are annoying, Microsoft's supporters aren't any better. But a fanboy's rant is forgettable; a manager's public rant is not.
Mr. Pryanishnikov must think very carefully about where, when and what he says, paying some attention to the same market shares I found by googling for a few minutes. If I can do it, he can too.

Tuesday, 23 November 2010

Chainsaw with Jigsaw

«Groovy!»
The Jigsaw Project is the offspring of the "Java Kernel" idea: cutting pieces off the JVM and the classpath to obtain a small "Java kernel" able to download and install by itself all the necessary packages and libraries.
We'd get Java with a 1 MB installer, no more. It would run some small programs, such as a telnet client and clones of the ls, cat, more, etc. utilities. When you create a Java application, the new JVM would automatically download all the missing libraries.
I don't believe Jigsaw will resurrect the Java desktop, but there are some scenarios worth reflecting on. These days Apple is trying to integrate its desktop with its mobile solutions. The Mac App Store and the iPhone App Store are similar ideas; the MacBook's trackpad is the same multitouch device as the iPhone's display; MobileMe's services work as a glue between your different Apple devices. In this "extremely dynamic scenario", information (e-mails, photos, music, calendars, etc.) is always at hand. Why not expand this scenario to applications? Why not expand it to your copy and scores of Angry Birds, or your copy and saved games of Rage, so that they run everywhere, downloading just the necessary information from the network? It's true you can't have a Crysis written in Java, but a «Monkey Island» series, yes. This concept of «Buy Once, Run Everywhere» could potentially kill a set of market stores (what's the difference between a PS3, an Xbox 360 and a Wii if a downloaded game runs everywhere?), which makes me doubtful about this prospect. But there are some players that could be interested in platform independence.
Let me give you an example: I bought "Diablo 2" some years ago for PC. Blizzard gave me the opportunity to download a version for Mac OS X at no cost. They did a good job and I appreciate their honesty. Porting a game is hard work, but in Java it would be trivial. The game world is moving from power-players to casual gamers, and casual games don't need high performance or fancy graphics.
The Java Jigsaw platform could have been a big opportunity some years ago. Now, with this commercial pressure oriented toward killing every kind of portability, I am not very confident that Java will emerge from the desktop sea.

Monday, 22 November 2010

An interesting limit: for Mark->Steve, Ubuntu->Mac OS X

In an article reported on an Italian website, I read an interesting comparison: can Mark Shuttleworth be the Steve Jobs of Ubuntu?
It sounds like a provocation, but actually every Great Project (GNU, the Linux kernel, Debian, Ubuntu, Slackware, KDE and even Apple and Microsoft) has a leader who manages and guides all the project's members. Debian had Ian Murdock; Slackware has Patrick Volkerding; GNU has Richard Stallman; Linux has Linus Torvalds; OpenBSD has Theo de Raadt.
Steve Jobs is one of the most famous CEOs in the world: his keynotes are shows. He saved Apple from a deep hole (in 1997 it could have been sold to Sun Microsystems) and transformed it into one of the most powerful companies in the world. Some "never-seen-before features" were already present in GNU/Linux (Spaces, i.e. virtual desktops), but with Jobs's personal touch they became "new" and "amazing".
In FOSS (Free and Open Source Software) circles, Jobs sits on the Throne of Evil once reserved for Bill Gates. Without doubt Apple's devices are much more closed than others (the iPhone is a small fortress); no doubt an HTC with Android is freer and more open than an iPhone. But you can't classify Jobs just as "a tyrant", "a criminal" or "a shark".
Ubuntu is trying to earn space in the desktop market. It's a very competitive place, where the command line is evil. I think Mark Shuttleworth MUST BE the "Steve Jobs of Ubuntu", a landmark for desktop developers and for desktop users. He has been taking this role seriously for a year, with some revolutions in Ubuntu, such as window buttons on the left corner, Wayland to replace X.org and Unity instead of GNOME Shell. These ideas have been criticized by the Ubuntu community, but the objections sound like a "zealot reaction", not technical observations.
Mark Shuttleworth is becoming more influential. If his vision is as right and clear as I hope, maybe Ubuntu could become more widespread. Like every man, Steve Jobs has good sides and bad sides. But as a CEO, we must admit he's one of the best.
FOSS zealots must understand that making Ubuntu more functional, more attractive and less "fundamentalist" will help GNU/Linux desktop adoption. And «Mark Shuttleworth approaching Steve Jobs» could be a way to realize this dream (lim Ubuntu = Mac OS X).

Friday, 19 November 2010

The power of a "Singleton"

«Please excuse me if I didn't update my blog these days. I was busy with several activities, both at work and for my movie. Yesterday, with Fabio, I looked for a good place to shoot some scenes. We were lucky!»
A singleton doesn't need 1.5 GW
In this post I'll talk about singletons, the latest design pattern I've come across. They're everywhere in Cocoa, and they're useful in other programming contexts too. What's a singleton, exactly? On Wikipedia it's defined as restricting «the instantiation of a class to one object», and well, that's right. From a programming point of view it looks like this (in Java):

class MySingleton{
    private static MySingleton shared_singleton=null;
    private int counter;

    /**
     * Private constructor
     */
    private MySingleton(){
        counter=0;
    }

    /**
     * Increment counter
     */
    public void Increment(){
        counter++;
    }

    /**
     * Get the singleton
     */
    public static MySingleton getSharedSingleton(){
        if (shared_singleton==null)
            shared_singleton=new MySingleton();
        return shared_singleton;
    }
}

You see, when you use a singleton you have just one instance of the object (here it's called shared_singleton). You can get a reference to it with the getSharedSingleton static method.
In Cocoa Touch, singletons are the most comfortable way to have a shared resource; global variables (heavily used in C) aren't as good as a singleton. In Java you can easily use them in a multithreaded context, just by adding the synchronized keyword. Singletons are very good for sharing read-only data and they can be used in every object-oriented programming language.
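As a minimal sketch of that last point (the class names are invented for the example), marking the accessor synchronized makes the lazy initialization safe when several threads race to get the instance:

```java
class SafeSingleton {
    private static SafeSingleton instance = null;

    // Private constructor: nobody outside can call "new SafeSingleton()"
    private SafeSingleton() { }

    // synchronized guarantees that only one thread at a time
    // runs the lazy initialization below
    public static synchronized SafeSingleton getInstance() {
        if (instance == null) {
            instance = new SafeSingleton();
        }
        return instance;
    }
}

public class SafeSingletonDemo {
    public static void main(String[] args) {
        // every call yields the same object
        System.out.println(SafeSingleton.getInstance() == SafeSingleton.getInstance()); // prints "true"
    }
}
```

The price is a lock acquisition on every call; for a singleton that is read often, initializing it eagerly in a static field avoids the lock entirely.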

Singletons can be dangerous: if they're not well designed, they can know too much about other objects, they can accumulate functions, and they can cause trouble in a multi-threaded environment. A detailed description of the dangers of singleton abuse is "Use your singletons wisely" by J. B. Rainsberger, at IBM. It's an interesting read if you are a programmer. Another good read is the Singleton Pattern description, which also includes some examples of "Singleton Abuse".

Tuesday, 16 November 2010

The Last of the Patriots

The Game
Paul Davis is a game designer and a programmer. He worked on "Manhunt" and "Grand Theft Auto" as a level designer. Two years ago, following a link, I found his site, «Lastofthepatriots.com» (unluckily, it's now a dead link), where Mr. Davis hosted the eponymous project. In «Last of the Patriots» (I and II) he created a game engine from scratch, using OpenGL and Audiere as its base. His main objective was to write a game focused on story, moral decisions and player involvement. The result was «Last of the Patriots», which can be described as a "visual novel with a Zelda-like presentation": you won't shoot much and there are no special effects or 3D graphics. Last of the Patriots has a story based on a movie script. You will be surprised at how your feelings change while playing this game. "Last of the Patriots" is an interesting game experience and I suggest you give it a try.

The Technical Notes
After a while, I sent an e-mail to Paul Davis, asking him for some technical information. I was really interested in independent video-game development and I received his reply with enthusiasm. I was not disappointed: Mr. Davis was exhaustive and well prepared.
First: he used OpenGL for graphics and Audiere for the audio parts. A port to other operating systems, however, wouldn't be easy, because the input is managed through the Win32 API (OpenGL doesn't have a great input manager). All the graphics were made with a 3D program for modeling and animation: every frame is a snapshot of those animated models.
The game logic is implemented with an embedded scripting engine written by Davis. I can't say much about it, because I didn't find any scripts.
The whole game engine is coded in C (no C++ or Objective-C) by personal choice (he prefers C). He admits, anyway, that for a similar game Java could work well.

Conclusion
I encurage to play with with «Last of the Patriots» and its sequel, because some misteries (first of all, the title) are explained. I don't know if the official website will go online again, but you can still download the binary from this link.
Last of the Patriots is a very unusual game. Give it a try: it could open up some new and interesting perspectives. Mr. Davis said he wrote it to focus on story instead of graphics.
He did it.

Friday, 12 November 2010

Welcome back, Firefox!

Yeah, we missed you, Firefox ;) Welcome back!
I was a great Firefox supporter. I also work in public administration, where I always suggest installing it: Zimbra works well under Firefox (we still use Windows 2000). But over the last year I became less enthusiastic. When Chrome and Safari entered the browser market, Firefox appeared disoriented. It introduced some new features (Persona, Jetpack), but nothing really interesting for the end user; I never used extensions (I never found one really useful, maybe I'll search harder), and Chrome and Safari beat Firefox in brute speed. But now I'm using Firefox 4 (Beta 7) and I find it really fast.
What's new in Firefox 4?
First: the JavaScript engine now uses a JIT compiler (JägerMonkey), just like Chrome's.
Second: graphics are rendered using DirectX (on Windows) or OpenGL (on Mac OS X and GNU/Linux).
This second feature also means Firefox can use the new WebGL standard.
Groovy!
I suggest you give it a try if you switched to Safari or Chrome and you miss some extensions. You will be surprised.
And a final joke: what did Firefox do so far? The playmate? Errr...!

Thursday, 11 November 2010

iPhone SDK Troubles - Web Services

«I'm sorry Dave. I'm afraid you have to write it by yourself»

Writing a web service should be easier than implementing a new protocol. The iPhone SDK is an endless source of surprises to me. I discovered there are no methods to encode an NSString in Base64; no library to make writing data and HTTP headers easier; and just a small C function to get an MD5 hash. OK, it's enough to get a hash, but it's not well integrated into Cocoa Touch.
OK, I can still write them on my own. But I really can't understand how it's possible that a network-oriented platform like a smartphone doesn't have these basic functions. Do you think they're not "basic"? Well, Python and PHP have them as built-in modules. And on Android? I found the answer here.
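As a point of comparison, here is how little code those two "missing basics" take in Python's standard library (the input string below is just an example):

```python
import base64
import hashlib

message = "I'm sorry Dave".encode("utf-8")

# Base64 encoding: a single call in the standard library.
encoded = base64.b64encode(message)
print(encoded.decode("ascii"))

# MD5 hash, hex-encoded: also built in, no extra C glue needed.
digest = hashlib.md5(message).hexdigest()
print(digest)
```

The point is not the language: it's that a network-oriented SDK could reasonably offer the same one-liners.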
Anyway, a web service call on the iPhone is just an NSMutableURLRequest, with its HTTP headers set via [theRequest addValue:@"text/xml" forHTTPHeaderField:@"content-type"], passed to an NSURLConnection.
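For contrast, a minimal sketch of the same idea with Python's standard urllib.request (the URL and XML payload here are made-up placeholders, not a real service):

```python
from urllib.request import Request

# Build a POST request with an XML body; the header is set with one call,
# much like addValue:forHTTPHeaderField: on an NSMutableURLRequest.
# URL and payload are hypothetical, for illustration only.
body = b"<request><action>ping</action></request>"
req = Request("http://example.com/service", data=body, method="POST")
req.add_header("Content-Type", "text/xml")

print(req.get_method())               # POST
print(req.get_header("Content-type")) # text/xml
```

Note that Request stores header names with only the first letter capitalized, which is why get_header is queried with "Content-type".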
But I will spend the night writing the other necessary functions.

Wednesday, 10 November 2010

Buy your new iBrick now!

Think heavy

News from Ars Technica


I am (for work) an iPhone 3GS user. I like my phone: it works well and it has a large application store. ACTUALLY, I have no troubles with my iPhone. But Apple's latest policies don't reassure me. When I bought my MacBook Pro I was convinced I could buy my next computer after five or six years, maybe even more. I'm still convinced this is true, because Apple's tight integration between hardware and software is good. But this "tight integration" is becoming a hell on iOS devices.
Building software that works well on both a Mac G4 and an Intel Mac isn't easy. Anyway, if I had a Mac G4, I would probably accept that it launches programs more slowly than a new Intel Mac. But a smartphone isn't a computer, and it has a completely different mindset: when I dial a number, if I have to wait three seconds, I think something is going wrong; if I see a delay when scrolling my contact list, I think there are too many processes in the background.

Apple forbids iOS downgrades to prevent users from reverting to a jailbreakable version. I can understand this. But I can't understand how Apple can expect a user to gently accept the choice between an unusable "brick" and $500 for a new iPhone 4. I can understand that supporting patches for three or four iOS versions (e.g. 2.x.x, 3.1.x and 4.1.x) to guarantee an unbreakable OS on different hardware is neither easy nor cheap. But these solutions aren't well accepted.
After dropping the XServe line, I expect Apple to pay more attention to its consumer products.

Tuesday, 9 November 2010

This is my Way (land)

This is "MY" way: AC/DC, not Sinatra!
These days, the FOSS world seems shaken by an earthquake called "Ubuntu". Mark Shuttleworth, founder of the most widespread GNU/Linux distribution, in a post on his blog discusses the possibility of replacing X.org and the whole X server architecture with the modern and clean Wayland display server. Mr. Shuttleworth admits X.org is a living project, more active now than ever, and that it dies hard. But he also said:

«[..] we don’t believe X is setup to deliver the user experience we want, with super-smooth graphics and effects.
[..] We’re choosing to prioritize the quality of experience over those original values, like network transparency.»

Yeah, Mr. Shuttleworth! This is the first time I've heard statements like these since Waldo Bastian's analysis of KDE[1]: statements about the importance of user experience over technical decisions.

Now, let's see the differences between X.org and Wayland.

X architecture (courtesy of Wayland's site)


As you can see in this graph, rendering a frame involves a very long chain of API calls. And this is because (from Wayland's site):

«In general, the X server is now just a middle man that introduces an extra step between applications and the compositor and an extra step between the compositor and the hardware.»

Now, let's look at Wayland's architecture:

Wayland's architecture (courtesy of Wayland's site)


In this graph, you can see that Wayland embeds the compositor. This reduces the number of steps needed to render a frame and thus speeds up display. But, most importantly, Wayland embeds the detection of which window should receive an I/O event (such as a click). In X this task is done by the compositor; in Wayland it is part of the display server.

Wayland is also smaller than X.org and less resource-hungry. These features make it perfect for small computers too, such as netbooks and tablets. Furthermore, X can work as a Wayland client, and this can ease the migration to the new display server.
It's true that Wayland doesn't work on old hardware, but this just means we'll never use it on a Pentium 233. About this, Mr. Shuttleworth said:

«The requirement of EGL is new but consistent with industry standards from Khronos»

And he's right. Now the bad news: NVidia (in AaronP's words) says:

«We have no plans to support Wayland.»

Does this mean the end of the Wayland project? Who knows? Actually, NVidia just said they CURRENTLY have no plans, so this decision could change in the future. Anyway, a move to Wayland is a big step for the FOSS world, at least as big as the beginning of the KDE project.

[1] I read this analysis in an Italian magazine (Linux &C.) when KDE 3.0 was released. I can't find Waldo Bastian's post anymore, because all the forums are now dead links. Can you help me find it?

Monday, 8 November 2010

Adieu XServe

I'm sorry too, XServe
Another move from Apple. The XServe, the Apple server rack, will no longer be shipped after January 31, 2011. It's a big disappointment. I always dreamed of working on an XServe with Mac OS X Server. I was so curious to test its power, the usual tight integration between hardware and software, and its user interface.
Will Apple exit the server market? I don't know: it's still possible to "convert" a Mac Pro or a Mac mini into a server by installing Mac OS X Server, and this document shows how. Probably they're just cutting unnecessary hardware. But, as you can read, to match the power of an XServe you need from 3 to 30 Mac minis; obviously, a Mac Pro is a bit bigger than an XServe.
In a few words, Apple admits its big-iron line is not profitable. Apple is great for the client side, but sysadmins with really big requirements still prefer to work with IBM or SUN servers running GNU/Linux or Solaris. Well, «this is business» (to quote Lance Vance).
But let's reflect: a Mac mini costs 600€ and works out of the box if you install Mac OS X Server. OK, it's not as powerful as a SPARC T3-1 server or a home-built server running GNU or *BSD, but for a small company that needs a mail server, a file server and an SVN repository, it's enough. For the same price you can purchase another Mac mini as a "backup unit": if the system fails, Time Machine makes it easy to restore all services within an hour. OK, that's not enough for a medium or big company. But there are many more small companies, and some of them already use Apple solutions. Apple isn't interested in facing IBM or Oracle on the big-server battlefield. But these small companies, with 8-20 employees, can't buy (and aren't interested in) big iron: a Mac mini or a Mac Pro is more affordable than a blade server, and easier to configure and maintain than a GNU/Linux or *BSD box.
So, is this the end of Apple's servers? Who knows? The market is a strange beast. Maybe we'll see clusters of Mac minis; maybe Mac OS X will allow "assembling" several Mac minis into a single system; maybe the Mac Pro will become a "hybrid" (iBrid?) between a big client and a medium-sized server. Let's see. I hope we'll enjoy the show.

Friday, 5 November 2010

Stop smoking

Wanna try to stop?

I'm thinking about my Facebook page. What do I find there? Some interesting news (in Italy a politician, Nichi Vendola, updates his status at least twice a day). Some news posted by me (mainly blog updates and shared links). And a lot of (useless) status updates and a lot of not very interesting news (some close to spam). I started thinking about unsubscribing from the "Blue Social Network", because all the interesting news can be found with a good RSS reader, but I soon realized I can't.
Facebook isn't a "funny site" anymore: it's the "main street" of internet communication. Your Facebook page is your business card to the world, more than your blog or your website. It's the best place to put your advertising, because it will be sent to all your Facebook contacts.
You ARE your Facebook page; you ARE the news you post and the links you share. All this information builds your profile for the other Facebook users.
When someone is looking for you, they will go to Facebook. I can survive without Facebook, using just iChat connected to Facebook's Jabber server; but all the other social activities (publicizing a movie or a post on my blog, or just sharing a newspaper article) can be done only on Facebook.
Facebook is a powerful tool, but for some days my sessions have left me feeling sad. Why?
My answer (just my two cents) is simple. I feel like when I tried to stop smoking: I felt strange for some days because I didn't know what to do. Smoking was like a "filler", a way to avoid getting bored when I had nothing to do.
When you're in a boring office, with no chance to talk about interesting things with your colleagues, what can you do? Connect to the blue site.
Well, I've decided to stop "facebooking" for no reason. I'll stop using it without a precise idea of what to do there: no busy-waiting for a chat, no browsing all 200 demotivational posters because I don't know what else to do. I'll start using it conscientiously, as the last offspring of the BBSs.

PostScriptum
Take a look at this old Soviet device. It was created three years before Photoshop 1.0. Interesting :)

Thursday, 4 November 2010

Unity and so be it


Is this a flame? Maybe. But I support Canonical's decision to use Unity instead of Gnome Shell. Why? Because this "fork" is fundamental, not just for usability reasons, but mainly because it's a strong stance from the most widespread GNU/Linux distribution. By adopting Unity, Canonical made some decisions oriented both to usability and to development. Choosing Unity, Canonical shows us its direction: the underlying technology is useless if the user experience is poor. KDE 4 is on this road: great technology (C++, Qt, Plasma, Phonon, etc.), but the resulting desktop is not as "sexy" as Gnome on Ubuntu.
Technology can't be the objective. Christopher Tozzi said


I put emphasis on some statements, especially about "borrowing ideas": this is the basis of Free Software, which is not (as many think) a "hippy-programmers way of life". Free Software is much closer to an "ideal market" and to "competition" than other realities (such as Apple and Microsoft). The same ideas are implemented by different teams in different ways (e.g. EMACS vs VIM; Eclipse vs NetBeans; Gnome vs KDE; Firefox vs Konqueror, and so on). This environment allows many competitors to survive. But Canonical must monetize its work, and to do that it needs to follow users' desires. If Gnome Shell isn't attractive enough, then welcome, Unity.

Wednesday, 3 November 2010

Why EMACS is still important

«Hey you! Is that a VIM session?!»

Talking about EMACS in 2010 may seem stupid. But GNU's official text editor still maintains its charm, despite being nearly three decades old. Born from Richard M. Stallman's keyboard, EMACS is the editor of choice of many programmers scattered around the world, thanks to its main feature: it's programmable. I decided to try EMACS after years of fighting it.
If you use EMACS, enable cua-mode immediately (in your .emacs, it's just (cua-mode t)). cua-mode binds copy-and-paste and other commands to the familiar keyboard shortcuts control-v, control-x, control-z and control-c, making the user interface more "human" and sparing you two hours on the manual to understand what a "region" is and what the "yank" and "kill" commands do. Using this "trick", I finally started using EMACS for real and not just for fun.

EMACS vs XCode vs Netbeans vs Eclipse vs...
Can EMACS compete against an IDE? No, it can't, at least not right now. It's true that EMACS is a text editor with a LISP engine, so in theory you can turn it into an IDE, but I've never seen anything like that. And I'm not interested in seeing it. The "universal solution" (or "silver bullet") does not exist. If I'm working on a small project in C/C++, a bash script or a Python script, then I'll probably use EMACS; but to develop an iPhone app, a Java servlet or a 3D engine, I'll use an IDE, to analyze objects' methods, compile the project with Ant, commit it to SVN, etc.

Why should I use EMACS?
Just for fun. To learn LISP. But the main reason is that EMACS is on every operating system. Well, if you're writing a shell script over an SSH session, then VIM is enough. But when you need to write some Python scripts mixed with C code, then EMACS is the choice. It's everywhere, has a huge library of macros, works on Mac OS X, Windows and GNU/Linux, and ALL EMACS macros (including "search and replace in selection", "indent code", "open a shell", "open an SSH session to a server", "send an email to", etc.) work on ALL systems that run EMACS. If you are a programmer, you will probably find the C mode difficult at first: you have to configure the c-major-mode to work as you want. At the end of this post, I'll show my personal .emacs file.

EMACS isn't as good as VIM
If we're talking just about personal preferences, I have nothing to say: everyone has their own. I was a VIM user, but I finally switched to EMACS after some days with XCode. Why? XCode indents my Objective-C code wrongly if I forget some parentheses (with Objective-C this happens very often). I missed this advanced feature, and realizing that EMACS gets it right changed my feelings about it.
There are some things I would like to see in the next EMACS release: first, cua-mode enabled by default, because in 2010 only two or three nerds are comfortable with "kill and yank". Second, UTF-8 as the default encoding. Third, a modern configuration interface (the "Customization Macro" was useful in the '80s, but now it is a bit obscure).

My .emacs file
Some people have a very large .emacs file, filled with their own macros. I just want to use the C style called "bsd", with the tab character as indentation method and four spaces as tab width.

;C programming style
(setq c-default-style "bsd"
         c-basic-offset 4)
;for c-mode, use tabs instead of spaces.
;Set tab width as 4 spaces
(setq-default c-basic-offset 4
                  tab-width 4
                  indent-tabs-mode t)

Last advice
Take a look at the Emacs Wiki: I found it very useful.
Happy hacking :)

Tuesday, 2 November 2010

Oracle's short vision

Mr. Magoo, you are more fun

News of the day (October 29, 2010). On Computer World I read «Oracle: Google 'directly copied' our Java code». The article describes how and where Google allegedly copied Java code. From the article we read:

«The infringed elements of Oracle America’s copyrighted work include Java method and class names, definitions, organization, and parameters; the structure, organization and content of Java class libraries; and the content and organization of Java’s documentation»

I organize this claim into the following list:
  1. Java method names;
  2. Java class names, definitions and organization;
  3. Java methods' parameters;
  4. the structure, organization and content of the Java class libraries;
  5. the content and organization of Java's documentation.
Pay attention: these statements aren't so unfamiliar. Do you know where I (and probably you) heard them before? When SCO accused Linux of copyright infringement. SCO was better prepared, claiming Mr. Torvalds copied UNIX code directly into his kernel (Linux). In the final act of that ridiculous theatre, SCO showed the famous pieces of code: a few macro definitions from the errno.h header file. These macros look similar to this:

#define PI 3.14159265

The SCO-Linux controversies were more complex and still continue. Anyway, SCO showed some code and, in a computer-science lawsuit, gave some material to think about.
Let's return to Oracle's assertions. How solid are they? To a programmer with some knowledge of law and licensing, they seem ridiculous.

  1. You can't copyright a function name, nor a class name or declaration. They're not "trademarks". The only trademark could be the "java" prefix in some cases (such as java.*), but I have my doubts, because those names exist to comply with the Java standard definition;
  2. See point 1;
  3. See point 1;
  4. As I said in point 1, I can understand that Oracle dislikes the word "java" in a whole classpath, but it is there to follow the Java standard definitions;
  5. This one is harder: as you can read here, Oracle's documentation isn't distributed under the GNU Free Documentation License.
If you think about it, there's something else to say: Android is based on three things: a virtual machine (Dalvik), a language (Java) and a classpath (Harmony). Google never said Android runs Java, and they're right, because Dalvik is very different from HotSpot; Harmony is a Java classpath reimplementation released under the Apache License.
By the way, "Java" was released under the GPL by Sun Microsystems just before it was purchased by Oracle, and only some pieces are still under a non-free license. So Oracle's charges against Google seem inconsistent to me. If the problem is the documentation, then Google can pay a team to rewrite it.

I think the main reason behind this lawsuit campaign is the declining importance of the Java Micro Edition platform. Why would Oracle (a server provider) be afraid of losing the mobile market? The answer is that Oracle feels threatened by the decreasing revenue from Java Micro Edition licenses. Sun Microsystems earned money by selling smartphone producers (Nokia, Ericsson, Motorola, etc.) the authorization to include a JME on their products. I think this was the only profitable Java client platform. Now that JME is going out of the market (killed by iOS and Android), maybe Oracle is playing dirty to give new life to this project.

Friday, 29 October 2010

iPhone SDK Troubles - UIPageControl

«I'm sorry I didn't post anything yesterday, but I was busy: I shot the first scenes of my first movie with my friends Fabio and Tommaso. We worked well on this project, and they gave me their enthusiasm and their grit. Soon I'll show you something about this new job»

«I'm sorry Dave, I'm afraid I can't do that»
First article of my «iPhone SDK Troubles» series. Today I'll talk about an annoying problem: the impossibility of customizing the look of a UIPageControl.
The problem is simple: I want to change the background color and the dots of a UIPageControl. As far as I know, after a long search, there is no Apple-approved way to change the UIPageControl dots. You can still change their look (this article has the explanation), but then the application might not be approved for the App Store.
You can still change the background in two ways:

  1. if you just want to change the background color, set it in Interface Builder;
  2. if you want a background image, place a UIImageView under the UIPageControl and set the control's background opacity to 0%.
I find it ironic that I can change the background but not the dots. They are always white, so a light background makes them invisible. I hope Apple will consider adding an API to manage them.

Wednesday, 27 October 2010

Gosling's Reply

«Ok guys... I've something to say...»
James Gosling is a historical True Hacker. He is famous for his version of EMACS (Gosmacs), for his work at Sun Microsystems, and for being known as "Java's father". Soon after Sun's acquisition by Oracle, Gosling resigned, because he didn't like Oracle's policies.
Yesterday (October 26, 2010), I read on an Italian site Gosling's reply to Steve Jobs' statement about the removal of Java from the new Mac OS X (Lion). Googling a bit, I found the original article on Gosling's blog.

In a few words, Gosling said that if Apple is removing Java, it's because Apple wanted to write the official Mac OS X JRE on its own, compliant with Sun's standards. Apple's programmers worked hard and well, but some old design decisions made further development a hell: it is simply hard to keep supporting a full Java implementation. Apple has lost interest in Java at a moment when the Macintosh is growing in popularity, together with its development platform (Objective-C and Cocoa).

Gosling also accused Apple of lying when Jobs declared that «Sun (now Oracle) supplies Java for all other platforms». As I said before, there is no version of Java for Mac OS X on Oracle's site.

What can we say? If Apple supported its own Java runtime, it was mainly because they believed it would become the standard client platform (in an era when porting applications from Windows to Macintosh looked improbable). When the Macintosh became more widespread (from 2002 on), Apple's efforts on its Java runtime were redirected elsewhere.

But Apple's development model for Java was right: a JRE written by the same software house that writes the OS, as part of a stack that starts from the hardware, proceeds to the kernel, continues with the OS's APIs, and ends with an optimized virtual machine and a well-integrated classpath. Apple followed this road, giving us the fastest and best-integrated Java experience. Microsoft did the same, but soon started changing some Java features, making Java programs written on Windows incompatible with other Java implementations (who said "Embrace, Extend, Extinguish"?). Maybe, if Sun had thought seriously about a policy of openness in Java development (perhaps involving IBM and Oracle in defining new Java features, standards, etc.), we would have a Java kernel, a small and efficient JRE (just 3 megabytes), a more efficient Java GUI (a DirectX-based Swing? Something similar to SWT?) and a set of tools to deploy Java applications more easily (a Java jar could appear as a Setup.exe on Windows, a .dmg on Mac OS X, a .deb on Debian...).

Let's think about Android: Java is Android's standard development platform. If you look at where Android applications sit in the stack, you'll see the "ideal" situation I described.

Android's architecture. The Runtime plays the same role as the JRE
Unluckily, on desktop systems, a Java client means fewer differences among the various platforms. Can you imagine what that means for a market where you want to gain a monopoly?

Yeah, you know the answer.

Tuesday, 26 October 2010

HipHop: emotions no-stop

Mhh... there's something strange....

Some months ago, Facebook announced its new "HipHop" technology, used to speed up its servers. HipHop is a translator: when a page is requested for the first time, HipHop transforms the PHP source code into C++ source and then compiles it into native code with g++. The executable is then launched, and its output is passed to the web server.
Haiping Zhao, senior engineer and HipHop lead, said HipHop speeds up Facebook's servers by up to fifty percent. Not bad, really. So, will we start developing with HipHop in mind? I don't think so.

HipHop was born with a specific target: increasing the performance of one of the most visited sites on Earth. But after thinking about it for some weeks, I realized that better solutions could exist. Taking as fundamental requirements that it is impossible to rewrite ALL of Facebook in another language, and that most of the programmers involved know only PHP, we could:

   1. write a PHP-to-bytecode compiler. The JVM on the server side is great: it is scalable and it has a JIT.
   2. create a PHP front-end for LLVM. LLVM is a virtual machine AND a compiler. Apple is funding the LLVM project, probably to use it as the default compiler for its systems. LLVM supports multithreading.

HipHop seems to me like a return to CGI. After ten years of mod_php (created to avoid CGI), Zope, and other ways of reducing the use of CGI, we are returning to the roots of web programming.

In my humble opinion, HipHop looks more like a "patch" than a complete solution. Facebook's engineers know their job well: they developed Cassandra DB from the ground up, a great piece of work, well designed and well implemented (Java for scalability and fault tolerance; replication; the choice of a non-relational schema, etc.). It sounds strange that Cassandra and HipHop came from the same home, because they seem too different. I think the Apache Foundation is more coherent: Apache developers work on the server side with Java, with some exceptions on the client side and in C/C++ (such as Xerces and log4cxx).

I have no doubt that Facebook's administrators know what is best for their business, and I have no doubt they studied their problem carefully to choose the best solution. But their solution seems very strange to me, and I would like to understand why they chose to create HipHop.

Monday, 25 October 2010

There are no lions in Java

Alea iacta est


Just after a week of blogging, I decided to start writing in English to expand my user base. It's a hard decision, because it means facing my limits with this language. But I have decided: there are no more excuses not to write in English. If I make a mistake, I'll learn from it and keep writing my opinions on computer science and cinema.


Today I'll talk about the new Mac OS X (Lion) and its new features: at the last "Mac Event", Steve Jobs presented the Mac App Store (interesting), full-screen applications (less interesting), Mission Control (an enhanced Exposé), Launchpad (a way to visualize your applications) and... the removal of Java.
Well, that is not completely correct: Java is now "just" deprecated, which means that, sooner or later, Mac OS X will ship without a pre-installed JVM. All Mac users will then have to download and install a Java Runtime Environment, exactly as on Windows or Linux.
Jobs justified this decision by saying:



«Sun (now Oracle) supplies Java for all other platforms. They have their own release schedules, which are almost always different than ours, so the Java we ship is always a version behind. This may not be the best way to do it.»



Well, looking at Oracle's site, I see there are JVMs only for Linux, Solaris and Windows; FreeBSD has a Java package in its "BSD Ports" repository.
If you can get it for BSD, you can get it on Mac OS X too. In the worst case, we can compile it from source code (Java is released under the GPL).
But tell me the truth: do you really miss Java on your Macintosh? It has been about five months since I last used NetBeans on my MacBook Pro; I still haven't installed Vuze (Transmission works well too); Runescape... maybe one day I'll play it.
I said in a previous post that Java's habitat is the server side: in a web context, clients use AJAX and Flash; for heavy client applications, C/C++/ObjC; for non-CPU-intensive applications, Python, Ruby and Visual Basic.
Java on Mac OS X was a wonderful idea, and it was the ONLY version of Java shipped with a successful desktop system. The lack of really useful client software was tolerated for too long. If Oracle is really interested in the Java client, then it must redesign Java into a fast, comfortable and elegant development platform.
And if you are concerned about the fate of fussy Macintosh sysadmins, don't panic: they are "pro" enough to install a Java runtime environment by themselves.

Friday, 22 October 2010

Add a seat in Valaram?

Ulmo, Vala of the Waters (drawing by John Howe)
Anyone who follows the open-source world, or even just programming, will surely remember the uproar that broke out when some distributions started installing the Mono framework by default. The choice was fiercely contested: the fact that Mono is inspired by Microsoft's .NET and that, according to some, there were "licensing problems" pushed several users towards somewhat exaggerated initiatives, including reimplementing in Java or C++ the programs "guilty" of having been written with Mono (such as F-Spot and Beagle).
The more moderate, however, knew well why Mono had been chosen for writing some applications, and the reasons were the same ones for which Microsoft built the .NET framework: the ability to include entire pieces of other programs with extreme ease. You can include Firefox as the browser in your application with an import and a Firefox f=new Firefox(). Not bad, eh? And this can work for other programs too, such as Evolution, Pidgin, Nautilus, spreadsheets, databases and so on. This is why people started working with Mono, and it's the same reason that earlier drove Microsoft's engineers to design .NET.
The choice may be debatable, OK. But keeping Gnome's development going is not easy, and it can't be solved with a priori stances or wild C++ coding.
A separate team of programmers, however, started working on the problem in parallel and designed another programming language, clean as Java, fast as C++ and integrated with Gnome's GObject object system. The result is called Vala, that is, "Power", the generic term for the gods of Tolkien's «The Silmarillion».

I myself was doubtful about the capabilities of this language, especially because of its toolchain which, I'm sure, will make you raise an eyebrow: a program written in Vala is read by the compiler which, instead of turning it into machine code, turns it into C code, which is then compiled by good old gcc. It's not that strange: the first C++ compilers worked exactly this way.
The choice of creating a new language is not to be taken lightly, especially in an era when you can count at least eight in serious use, and at the time I dismissed Vala as "the silly idea of four geeks".
Instead, contrary to my predictions, I have discovered that Shotwell, Ubuntu's new photo manager, is written in Vala. And not only that: other interesting projects by the Yorba group are written in this new language. It seems, in short, that the "silly idea" wasn't so silly after all.

Could Vala take Mono's place in Gnome? That could be interesting. Could it change anything in the programming landscape at large? Unlikely, since it is heavily based on GLib and GTK; moreover, it wouldn't be portable at the binary level, being compiled to machine code with gcc.
About its future I really don't know what to say, other than "we'll see". But, surely, sooner or later Vala will have to face a language that conceptually resembles it and is funded by the "usual suspect": the language is Go; the financier, Google.

Thursday, 21 October 2010

Meanwhile, on the island of Java...

The garbage collector keeps a watchful eye on the system heap
Launched in 1995 with the motto "write once, run everywhere", Java has spread to many kinds of devices, with mixed fortunes: on mobile phones it had a timid success, eclipsed by the Android juggernaut; on the client it is used by a brave few and by developers who use Netbeans; the famous Applets died around 2001.
And on servers? Well, the Apache Foundation uses it for practically all of its projects; IBM and Oracle build their software solutions on it.
So: a success or a flop?
It depends. That Java is used is beyond doubt (the numbers are there), but its destination is completely different from the one intended: born as a client platform, Java succeeds only on the server. If you think of the REAL Java client programs, the ones used seriously, what do we find? Eclipse, Netbeans, Vuze and Runescape. Nice, powerful programs, but too few to call it a success. On the server side, instead, what do we see? Taking a peek at the Apache Foundation site I find:
and at least another ten open-source projects, 90% of which are written in Java.

The famous Java Virtual Machine
"Java" is an umbrella name covering three things:
  1. a language
  2. a Virtual Machine
  3. the classpath (the standard library)

The Angry Monkey hates the JVM
The language and the classpath are appreciated and admired, so much so that Android uses the Java language as its development platform, together with the classpath reimplemented by Apache (the Harmony project). But ask a developer what they don't like about Java and they will point the finger at the much-hated Java Virtual Machine. Nobody has ever had trouble WRITING a Java program, but almost everybody has complained about execution speed.
In reality, once started, Sun's Virtual Machine (codenamed "HotSpot") works fairly well, with performance generally not far from C++'s, thanks above all to the JIT. The benchmarks show that trigonometric calculations are still very slow, so we will avoid using Java for scientific computing.
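To get a feel for what the JIT does, here is a minimal, deliberately unscientific micro-benchmark sketch (the class and method names are mine, not taken from any of the benchmarks cited): it runs the same trigonometry-heavy loop a few times; the first pass is largely interpreted, while later passes typically benefit from JIT compilation.

```java
public class TrigBench {
    // Sum of sines over a range: a deliberately trig-heavy workload.
    static double sinSum(int n) {
        double acc = 0.0;
        for (int i = 0; i < n; i++) {
            acc += Math.sin(i * 0.001);
        }
        return acc;
    }

    public static void main(String[] args) {
        // Run the same workload repeatedly: HotSpot compiles the hot
        // method after the first passes, so later runs are usually faster.
        for (int run = 0; run < 3; run++) {
            long t0 = System.nanoTime();
            double r = sinSum(2_000_000);
            long ms = (System.nanoTime() - t0) / 1_000_000;
            System.out.println("run " + run + ": " + ms + " ms (sum = " + r + ")");
        }
    }
}
```

The absolute numbers mean little; the point is only to see warm-up effects with your own eyes before trusting any Java-vs-C++ comparison.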
But let's see what the benchmarks say about Jake 2, a Java reimplementation of the Quake 2 engine. It seems Java comes out less bruised than expected against C/C++ in terms of speed. Not bad at all, considering the code can be estimated to be at least 20% less complex (no header files and no pointers).
Another interesting article is by one of the Irrlicht Engine developers, who wrote an insightful analysis of the differences between Java and C++ to see whether it would be worthwhile to write a graphics engine for Oracle's platform. The results, as he says, crown C++ the winner "even though Java came close"; the author also notes that, with optimizations here and there, the results can be tipped in favor of either language; moreover (honest and impartial) he admits that the numbers would probably have been more favorable to Java had he not counted the Java Virtual Machine's startup time and had he run the tests with the -server optimization. You'll find his conclusions at the link, which I recommend reading because it is well written: what I wanted to show is that, all in all, the gap in execution speed has certainly narrowed.


At Sun they made servers (and it shows)
Sun Microsystems was famous for its server systems, of which it took care of everything: the processor (SPARC) was designed in-house; likewise the operating system (Solaris); the network file sharing system (NFS) was theirs too. In short, when it came to servers, Sun knew exactly what it was doing. And on the client side? Don't you know CDE, NeWS or the Looking Glass project? Look closely and maybe you'll find them familiar. Maybe because
  • CDE's ideas were picked up and improved by the XFCE desktop environment, one of the most famous in the open-source landscape.
  • the basic workings of NeWS are very similar to the Mac OS X window system (which uses PDF instead of PostScript)
  • Looking Glass surely inspired one of the latest user-interface patents registered by Apple
In short: great ideas, but perhaps poorly managed. After all, Sun made its money with servers and support. In 1993, the year work on CDE began, Microsoft was becoming the colossus we all know, and Apple, although in troubled waters, was still a name to be reckoned with. This "software architect" mentality also shows in the user interfaces of the Java platform: look at the Java2D documentation and you'll see a richness and variety of classes and functions that would make anyone pale; look at how Swing is designed and you'll see it is engineered so well that you can embed OpenGL routines as a button's icon (look here).
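As a small taste of that Java2D richness, here is a minimal sketch (class and method names are my own choice): it renders an antialiased shape to an offscreen image, with no window at all, using only standard Java2D classes.

```java
import java.awt.BasicStroke;
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.geom.Ellipse2D;
import java.awt.image.BufferedImage;

public class Java2DDemo {
    // Draws an antialiased circle on a white background and returns
    // the image; runs headless, no window or display required.
    static BufferedImage render(int size) {
        BufferedImage img = new BufferedImage(size, size, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                           RenderingHints.VALUE_ANTIALIAS_ON);
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, size, size);
        g.setColor(new Color(0x33, 0x66, 0x99));
        g.setStroke(new BasicStroke(3f));
        g.draw(new Ellipse2D.Double(10, 10, size - 20, size - 20));
        g.dispose();
        return img;
    }

    public static void main(String[] args) {
        BufferedImage img = Java2DDemo.render(100);
        System.out.println("center pixel: 0x" + Integer.toHexString(img.getRGB(50, 50)));
    }
}
```

Strokes, rendering hints, arbitrary geometric shapes: all of this ships in the standard library, which is exactly the over-engineered richness the paragraph above is talking about.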
Technically these are impressive things, but launch a graphical program that uses Swing and you'll notice an embarrassing slowness (Netbeans is perhaps the only exception); and if you have programmed with Swing, you know its layout hell well. Then, when you realize we had to wait until 2008 and Netbeans 6 for a visual interface editor (Matisse), while Mac OS X, Windows and QT have had one since their release, you won't be surprised that hardly anyone uses Swing.
At Sun, evidently, they had a passion for text interfaces.

Let's build a server
If you try to write a multithreaded server in Java you'll get a pleasant surprise: in fewer than a hundred lines of code you'll have a working echo server that handles exceptions and timeouts. With some effort you can build a decent IRC server in half a day. You won't have to worry about memory leaks, and it will be reliable thanks to the much-maligned garbage collector (did you know you can choose among six of them, depending on your needs?). And if your hardware can't keep up and you want to add another machine alongside it, you won't have to reimplement anything: the Java platform scales, in the sense that its application servers can balance the load across machines transparently. Parallel computing brings the problem of critical sections: where in C or C++ you must resort to semaphores or mutexes and do very careful debugging (with multiple threads/processes it is harder to reproduce error conditions), in Java you just wrap the critical section in a class whose methods are marked synchronized, and the problem is solved.
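A minimal sketch of such a server (class and method names are mine): one thread per client, a per-connection timeout, exceptions handled, and a synchronized method as an example of the critical-section style described above.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class EchoServer {
    private int clients = 0;

    // A critical section the Java way: synchronized keeps the counter
    // consistent across handler threads, no explicit mutex needed.
    synchronized int registerClient() { return ++clients; }

    void handle(Socket s) {
        try (Socket sock = s;
             BufferedReader in = new BufferedReader(
                     new InputStreamReader(sock.getInputStream()));
             PrintWriter out = new PrintWriter(sock.getOutputStream(), true)) {
            sock.setSoTimeout(30_000);       // drop silent clients after 30 s
            String line;
            while ((line = in.readLine()) != null) {
                out.println(line);           // echo each line back
            }
        } catch (SocketTimeoutException e) {
            System.err.println("client timed out");
        } catch (Exception e) {
            System.err.println("client error: " + e.getMessage());
        }
    }

    public void serve(int port) throws Exception {
        try (ServerSocket server = new ServerSocket(port)) {
            while (true) {
                Socket s = server.accept();
                System.out.println("client #" + registerClient());
                new Thread(() -> handle(s)).start();  // one thread per client
            }
        }
    }

    public static void main(String[] args) throws Exception {
        new EchoServer().serve(7007);        // port number is arbitrary
    }
}
```

Well under a hundred lines, with timeouts and exception handling included; try-with-resources closes the sockets for us, and the garbage collector takes care of the rest.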

The Tomorrow of the Island of Java
A lot of road has been traveled since 1995, and many things have changed: grunge is out of fashion and the Internet is in every home. The client revolution promised by Java was carried out first by Flash and then by AJAX, and Sun paid for its missteps by being absorbed by Oracle.
The Java platform is anchored to the big enterprise servers that must grind through thousands of requests across multiple machines, reliably and scalably. The NoSQL database Cassandra is the perfect example. There will surely be people who keep writing client programs in Java, but more convenient alternative platforms such as Python and Ruby will probably be preferred more and more. And Android, even though it uses the language and the classpath, will surely not refuse, in the near future, optimized Python interpreters, perhaps even JIT-equipped ones.
I can't make predictions: many banking systems use Java as a backend and are unlikely to abandon years of development and testing in favor of something else (a similar phenomenon exists in the scientific world with Fortran). Everything is in the hands of Oracle and, in part, of the Java community: it is up to them whether the island becomes a busy hub, a nature park, or a reserve.