Monday, May 28, 2007

Congress in Porto Alegre

I'm back from Porto Alegre, where I attended a congress on business administration (export, import and business between France and Brazil). There were some good presentations and the overall event was interesting.

We actually know a couple of people from this region: my brother-in-law, and another couple who spent some time in Cambridge, UK, working with specific professors on their topic of interest. We met up with them as well. It was a lot colder there than in Recife.

Just on the outskirts of the city are the lake and a region they call "Ipanema". I had a look at the large park close by, which hosted the "World Social Forum" some years ago.

Luckily there was a bit of time left in the evenings, which we spent with my brother-in-law visiting some good restaurants in the area. This particular area had quite a few serious Italian restaurants that I can recommend. Another one was the "Bistro da Rua", which is also a good place to just drink a bottle of wine and chat with some friends.

Tuesday, May 22, 2007

Linux Ubuntu on Dell

Dell is going to offer hardware that comes with Ubuntu Feisty (7.04) pre-installed. These are exciting prospects for the propagation of Linux into mainstream, non-geek markets. Does this mean that, by Dell's evaluation, Linux is finally considered mainstream-compatible?

Well, I know for one that Dell runs a lot of Windows software: they don't just offer it with their hardware, they also run it internally. So they are not particularly attached to any software as a technology (or as a religion). We also need to consider this move from Dell from the perspective of support.

Windows as the OS, plus a couple of "productivity" applications, is a basic platform. Consider the following formula:

known hardware + known software == x probable support calls

However, when we start installing "3rd party" software into this mix, the number of support calls can suddenly grow much larger. Not linearly, but potentially exponentially, especially when a commonly used piece of software happens to be incompatible with the hardware.
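To make that intuition concrete, here is a tiny illustrative sketch (my own toy model, not based on any Dell figures): the number of configurations a support desk has to reason about multiplies with every independently installed third-party package.

```python
def support_configurations(hardware_models: int,
                           third_party_packages: int,
                           versions_each: int) -> int:
    """Rough model: each hardware model combined with every
    possible version combination of the installed packages."""
    return hardware_models * versions_each ** third_party_packages

# One hardware line with only the stock OS image:
# a single configuration to validate.
print(support_configurations(1, 0, 3))   # 1

# The same hardware with five third-party packages,
# three versions each: the test matrix explodes.
print(support_configurations(1, 5, 3))   # 243
```

The exact numbers are invented, but the shape of the growth (exponential in the number of independent packages) is the point of the formula above.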

Ubuntu, as we know, has a repository with a very large amount of software that should satisfy most people's needs. I honestly cannot remember the last time I downloaded an RPM or DEB from the Internet and installed it outside any repository.

This is a different kind of support that they have probably investigated: testing the known software in the repository on their known hardware. If this provides all the software tools people need and it all works, then it is a fantastic service and may bring costs down significantly on the customer support front. Plus, since more people than just Dell's customers use the software, Dell support is no longer alone in resolving issues. They can count on the large number of forums, blogs and the wider community to help people out.

The question is thus: will Dell treat the community well? If they manage to become a responsible member of the community and contribute their productivity back, they gain much more than they originally invested, as the momentum of the Ubuntu community should definitely increase. Openness in communication, testing of software on many different machines with many different kinds of users, hardware that works, software that works... Will this generate happy customers? Is this the future of computing? If the mainstream picks up on it, does this mean the market will *demand* Linux on its computers, for community support (no fee!) and pre-tested distributions and hardware/software packages?

Some people are already speculating that Dell might start their own Ubuntu repositories and become full mirrors. If the cost of running those mirrors and doing hardware/software compatibility testing is lower than running a whole array of customer support reps, this would actually lower Dell's opex, which means more profit! Exciting indeed...

Friday, May 18, 2007

Swamped...

A bit swamped at the moment, so there is little time to blog about anything. I will deploy a new release of Project Dune over the course of the weekend, which should include the new timesheet module. It will only be sort of alpha-ish, since the locking functionality won't work yet. It also needs a breakout class for the timesheet processor, so you can export your reported hours to another system.

Well, great. We've been playing with GWT 1.4 for a while now. The hosted-browser fixes and memory-leak fixes really do help development, so you could say it really is maturing.

I'm looking forward to the SuggestBox, NumberFormat and DateFormat classes and the rich text editor for the document management part. I haven't had any time whatsoever to deal with HTML <-> DocBook conversions though, let alone putting in place some simple templates for PDF conversion using XSL-FO.

Wednesday, May 09, 2007

Performance of VMware on Linux disappointing?

This probably only affects laptops with power-saving features, but you never know. To check the current setting:

cat /sys/module/processor/parameters/max_cstate

To cap it (as root):

echo 3 > /sys/module/processor/parameters/max_cstate

This value controls how many levels of sleep the processor may enter to save energy (reducing its performance and power consumption when idle). Deeper C-states save more power but add wake-up latency, which can hurt virtual machine performance.

Friday, May 04, 2007

Computing challenges of the future

Since the first computer, we've come quite a long way in integrating computers into our daily lives. This has significantly changed the formation of our society, in terms of expectations, beliefs and ethics.

There will come a time, though, when all the processes (activities) we perform are already automated, and then we'll have a number of companies competing on the same level. Consider an application: it performs a certain kind of activity for the user. We have applications that work on private data at home (photo editing, accounting, office work like writing a letter, and so forth). At the moment we see these activities moving "online", so that we can do them from any place and not just from home.

The majority of applications have "options" and functionalities that the user can choose from. By using these options and functionalities, we come close to executing the particular activity we really wanted to do. New activities can be invented that were otherwise not possible, but in general an application replicates an existing activity in a different context, transferred there for another need (like sending a letter: not just writing a card at home and posting it, but sending it from your PC, and now from your phone, also called email). Different contexts, but essentially the same thing. Our society, however, changes its perceptions just by using that technology and, more importantly, changes its expectations and constant needs (people become dependent on the technology).

There will come a time, though, when many of these services or functionalities are already automated, and that "aggregated value" can no longer be provided just by automating such an activity. It needs something else.

I guess that one of the challenges of the future is that computers need to be more helpful in assisting with decision-making or detection. This probably means a lot of data warehousing, processing and analysis, and then using the outcomes of that whole process. It goes way beyond any kind of simple algorithm. We need to understand better, mathematically, how the world fits together.

So the future of IT seems geared more towards "optimization" and "efficiency" than the plain "automation" it is still enjoying today. After we have automated the majority of tasks and all these companies offer their products, the next step is the optimization of those tasks. To better understand activities, you must understand more about the activity's domain. Or rather, linking back to my previous post: a computer engineer will no longer be able to compete at higher levels just by having knowledge of computing or engineering. You'll have to invest in gaining knowledge of other problem domains and then apply the knowledge of one domain to the other, as a cross-over.