Virtualization of SnowOSX on Windows 7

Why is Apple’s Mac OS X important?

For years, the developer community hardly ever asked this question. However, with the recent “gold rush” of mobile app development, Mac OS X is relevant again.

Imagine an OS that allows development for all the major mobile platforms and is also lovely enough to show off to your girlfriend!

Unfortunately, not everyone can afford an expensive Mac Pro machine, especially not a developer who has just lost his job to the recession.

Well folks, Oracle VirtualBox comes to the rescue. It’s free and lets you install SnowOSX (a hackintosh based on Mac OS X 10.6 Snow Leopard) on any Windows or Linux machine. The steps are simple and are documented all over the hackintosh websites: you need an Intel VT-x or AMD-V enabled CPU, ideally 4 GB of RAM, and 20 GB of free hard disk space, and that’s it. After SnowOSX is installed, you can go on to install Xcode with the iPhone SDK and Eclipse with the Android SDK. Try as many mobile SDKs as you like; Mac OS X supports all of ’em.

But be careful, as it breaks nearly every Apple license agreement 😉


Ruby and threading for multi-cores

With the CPU manufacturers’ focus shifting to multi-core architectures rather than GHz boosts, a lot of debate and research has started on scaling existing software applications to take advantage of this development.

Let’s consider Moore’s Law for a moment. It is not really a mathematical law, just a rule of thumb that the CPU manufacturers, aka Intel and AMD, follow. The software industry, expecting that much growth in processing power X months down the road, always optimizes its software to match the performance of the hardware at the time the solution will launch. This growth pattern is followed by game developers, DBMS vendors, corporate solution vendors like SAP and Siebel, and others.
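As a back-of-envelope illustration of this rule of thumb, here is a tiny Ruby sketch of the projected growth factor X months out. The 24-month doubling period is my assumption for the example, not a figure fixed anywhere:

```ruby
# Moore's "law" as a rule of thumb: the transistor budget roughly
# doubles every couple of years. Growth factor after `months` months,
# assuming a 24-month doubling period:
def moore_growth(months, doubling_period = 24.0)
  2.0 ** (months / doubling_period)
end

puts moore_growth(24)  # => 2.0 (one doubling)
puts moore_growth(36)  # about 2.83
```

So a vendor planning a launch 36 months out would budget for hardware nearly three times as powerful as today’s, under this rule of thumb.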

Optimizing software solutions to match raw GHz processing power was easy, but matching multiple cores, 2/4/8, seems a rather difficult task, as today’s software languages don’t support automatic multi-processing. That is to say, a lot of manual coding effort is required to keep parallel processing optimized for parallel tasks. Then there is the daunting work of synchronizing the manual threads: avoiding deadlocks and thread starvation, keeping critical sections safe, and the like. This is the situation with all the modern dominant languages like Java and C#.
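To make that manual synchronization burden concrete, here is a minimal Ruby sketch: several threads increment a shared counter, and a Mutex keeps the critical section safe by hand. The names (`counter`, `lock`) are illustrative, not from any particular codebase:

```ruby
require 'thread'

counter = 0
lock = Mutex.new

# Four threads hammer the same counter; without the mutex, the
# increments could interleave and lose updates.
threads = 4.times.map do
  Thread.new do
    10_000.times do
      lock.synchronize { counter += 1 }  # critical section, guarded by hand
    end
  end
end

threads.each(&:join)
puts counter  # => 40000
```

Every shared structure in a program needs this kind of hand-written guarding, which is exactly why scaling existing code to many cores is so laborious.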

On the other hand, it’s time for another software language shift. Industry consultants predict that every 10 years or so, the dominant software language gets replaced by another. Case in point: Cobol, Fortran, C++ and now Java.

The most appropriate successor to Java seems to be Ruby.

I found one of *the* most comprehensive coverages of threading in Ruby and the latest trends and research.
But the problem, as I see it, is that there is no coherent line of thought developing, nor research heading anywhere concrete.
Ruby MVM (multiple virtual machines) is posed as the best option available, but issues with this approach are acknowledged, and any successful work on it is hard to find.

It seems threading in Ruby will remain experimental in nature and may only improve with the advent of some other language that solidly implements the threading experiments done on the Ruby side.

Ageia PhysX: a true revolution in PC Gaming

Ever since the first announcement of the PC gaming technology called PhysX by Ageia, at Computex or some such exhibition in spring 2005, I have followed the technology very closely. Ageia is/was a startup by an ex-Xbox guy and other gaming industry veterans.
In fact, it is a software physics engine for computer games together with corresponding physics hardware built around a special-purpose chip. They have named this chip a PPU: Physics Processing Unit, as you can guess.
Just lately it occurred to me what a lofty challenge it is, from both a technological and a business point of view, to get this thing established.

Comparison shots of GRAW

Business Aspect:
First of all, there is the chicken-and-egg problem. Game developers say that since there are not enough PhysX hardware cards, they cannot invest time, resources and effort into enabling hardware physics in their games. On the other hand, the consumers (the gamers) simply will not buy PhysX cards because there are no top-rated titles with optimized hardware physics.

The way the Ageia guys are coping with this challenge is through their executives’ networking with game developers, and this seems to be working.
Secondly, Ageia is also providing its software physics engine PhysX, previously NovodeX, as a free API to any developer, on the condition that the developer optimizes their games for the matching PhysX hardware. Meanwhile its software competitor, Havok, the most famous maker of game physics engines, licenses its engine for measurable royalties.
Ageia has already got the Unreal guys to use its physics engine in their Unreal Engine 3 and the game Gears of War, which will eventually be ported to PC. Unreal is also the most popular modern engine for other game developers to license. All in all, the Ageia guys claim 100 titles by 60 game creators in 2007-08.

Ageia PhysX logo

As we all know, the big 3 console makers released their next-generation consoles in the last year, with the Nintendo Wii taking the bigger share of the cake, so it’s really hard for PC gaming even to survive. Plus, for a gamer, playing the latest games means considerable upgrades. Upgrades that can cover the cost of buying all the major consoles!
A gamer now needs to buy a DirectX 10 graphics card + Microsoft Vista + an Ageia PhysX card to get the full experience. I’m excluding the cost of upgrading memory and assuming that an average gamer already has at least a dual-core machine with a Creative X-Fi card.
Again, it makes very little sense for a game developer to devote energy to PhysX development rather than optimizing her game for all three consoles.
I’m really not sure how the Ageia guys are fighting this challenge other than by using their contacts. But I think the way to go is to develop an itch in game developers to do what they have never done before 😉: showing off their game with flashy hardware physics effects!

Considering that the software game physics engine marketplace was dominated by Havok as a near-monopolist before the arrival of Ageia PhysX, Ageia has made quite a bit of progress already. Havok seems to be on the run. It has produced a solution called HavokFX that is supposed to use nVidia SLI technology to do hardware physics, but the solution hasn’t delivered and has never left beta.
Ageia has already overcome this challenge.

SLI HavokFX physics

nVidia and DAAMIT (ATI/AMD) have provided some hasty triple-card solutions to counter the attack of the Ageia PhysX cards: two cards to render graphics and one to do physics calculations. Agreed, dual-card setups and their driver support are a step towards doing dual-core graphics chips elegantly in the next generation (GeForce 9 & Radeon X3), but against the proposition of just buying a console, this stands no chance.
nVidia triple SLI is supposed to use the upcoming HavokFX API (I’m keeping my hands raised in prayer for its release), while DAAMIT triple CrossFire is supposed to use its own abstraction layer of some kind that nobody understands, let alone uses.
However, I personally believe the place for a hardware physics chip is on the graphics card, to reduce the latency of GPU–PPU communication, among other improvements.
This challenge, again, is overcome, and the competition’s solution is messy at best.

The smallest business challenge, I suppose, was to convince AIB (Add-In Board) partners to provide cheap cards based on the Ageia PPU and distribute them properly.
This is a challenge no more 🙂

Asus Ageia PhysX card

Technical Challenges:
The most important technical challenge, I believe, was to keep the performance of a hardware-PhysX-enabled system equal to that of a disabled one. This depends a lot on the game developers; they can write pretty bad code.

However, I would like to emphasize that the NovodeX game physics engine was not widely popular among game developers, as opposed to Havok. It definitely lacked some features, but its performance delta could not be judged.
Initial benchmarks revealed a negative delta of 10%-20%, and a drop in performance just to see eye candy in games isn’t the best trade-off.
Also, initial games only embedded eye-candy hardware physics effects, like wild explosions and falling building debris. Advanced effects, like fluid motion and avalanches, were left for next-generation games [and probably the next-gen PhysX PPU]. Moreover, the initial physics also seemed rather random, contrary to the prevailing belief.
Ageia is still fighting hard to overcome this challenge, but it seems to have become a real thorn in the side from the end-user perspective. This is backed up by the fact that gamers who invest in such luxuries as Ageia hardware PhysX cards are already very performance-sensitive.

Cloth Animation

…and then there was the driver. For every piece of hardware released in this world, support by a solid driver enhances the user experience many times over. The driver team released updates from time to time, and each update unlocked new feature sets of the PPU. Since not many users installed the cards, the stability cannot really be evaluated.
…then there was the new Windows launch. MS Vista, with its *brand new* driver model, has diminished the end-user experience: it’s already 5 months since its release, and even the huge and powerful driver teams of nVidia and DAAMIT have not been able to release a driver-for-Vista-with-DirectX10-in-32-bit mode. Heck! DAAMIT has not even launched its graphics chips for DirectX 10.
Then, after the DX10 hardware+driver releases, there will be conflicts between the Ageia PhysX drivers and these GPU drivers.
Also keep in mind that Ageia has not released Vista drivers and doesn’t seem at all interested in 64-bit drivers.
I think 64-bit gaming will not get popular anytime before 2010.
Since this is an ongoing challenge and Ageia is still making inroads into it, we should consider it a challenge forever.

PhysX driver options

This challenge is almost invisible, but it’s very important. Game creators are not just a bunch of code junkies but also cool content creators. In game development, creating the 3D artistic content and making the code execute it efficiently is a long-lasting challenge.
Now come physics effects: the content creators need to artistically design these effects, and the code junkies need to enable them using the PhysX API and optimize them later on.
The only thing Ageia is doing right now is working closely with game developers on a person-to-person basis and helping them out. But in the longer run, they need to develop best-of-breed tools to enable productive development of artistic content in games.

Blood Burning

We all forget that Ageia is also a hardware chip design firm, so they have dedicated chip engineers working on next-gen PhysX chips. The initial chip was a 125-million-transistor design on a 130nm process. Now they are on the road to a process shrink to 80nm.
With a shrink of this magnitude, they are bound by rule of thumb to increase the transistor count, which means either more PhysX hardware features, more precision in the calculations of existing features, or both. Furthermore, the long-awaited shift to the PCI-Express standard is coming. We still have to see whether they will support PCIe 1x/4x or 16x, and whether they will support the PCI-Express 2.0 standard in time.
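A quick back-of-envelope sketch of what that shrink buys, assuming ideal, purely geometric scaling (which real processes never quite achieve, so treat the numbers as an upper bound):

```ruby
# Ideal scaling: linear dimensions shrink by 80/130, so area per
# transistor shrinks by (80/130)**2, and the same die area can hold
# 1 / (80/130)**2 times as many transistors.
old_node = 130.0 # nm
new_node = 80.0  # nm
density_gain = (old_node / new_node) ** 2
puts format('%.2fx transistor budget at constant die area', density_gain)

# Applied to the 125-million-transistor first-generation PPU:
puts (125_000_000 * density_gain).round  # roughly 330 million transistors
```

Even with real-world scaling losses, that headroom is what makes “more features or more precision, or both” plausible.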
Ageia is working on this challenge and they are about to launch something soon.

Shrapnel from an Explosion

Last and least, there will always be a need to improve the design and power of the PhysX software game engine by adding more effects. This is again an ongoing process.
Ageia doesn’t seem too focused on this, considering all the above challenges, but once things stabilize, in say a year or a year and a half, the engine may be upgraded.

However, there is a caveat here. If Havok, the only (and more popular) physics engine competitor, enables drastically astonishing physics effects through its API, the landscape will alter altogether. Game developers will keep their focus on Havok, and GPU makers will focus on triple-card technologies to beat Ageia.
That’s my analysis of Ageia, and kudos to the Ageia management team for coming this far! Let’s see what happens in two years’ time.

Stay tuned!

Apple’s new buying frenzy: 3DLabs

Among the many rumours in the Apple rumour mill related to its infatuation with user interfaces and graphics, here is the latest:
Apple is in talks to buy 3DLabs from Creative.

The reasons given by 3DLabs officials are mentioned below, while Apple officials, as usual, declined to comment:

Reason #1: 3DLabs had a very bad time with Creative, which never took their workstation-class graphics offerings to consumers, being too shy of the waters to compete against giants like nVidia and ATI (now AMD). Creative’s decision to restrict 3DLabs to portable handset graphics proved to be the last nail in the coffin. Last October, 3DLabs declared its intent to split from Creative, with Creative agreeing to the proposition, and to be taken over by some serious graphics firm.

Reason #2: Apple is in dire need of increasing its presence in the game industry. Mac OS X is the last platform even the big game developers port their top titles to, while OS X poses as the most polished platform when it comes to usability and media content creation and consumption. With 3DLabs come the expertise of the creators of the OpenGL graphics library as well as hardware support for all present and future endeavours of Apple in video game ventures.

Reason #3: 3DLabs lacks a serious player like Apple to support its vision, and Apple is in the safest financial position to get its feet wet in the one arena of the user-friendly OS platform it has not yet dared to step into.

Need I say more?