Tuesday, May 8, 2007

Yahoo! India Maps

Yahoo! has maps for 170 cities, 4,785 towns and more than 220,000 villages, searchable by address, street name or points of interest such as ATM or hospital locations. Yahoo! has produced the maps along with CE Infosystems. Ever since I got involved in GIS services, maps this detailed in the Indian context are something I kept hearing about. It looks like it's finally happening. GIS is certainly going to become mainstream soon.

Another interesting offering from Yahoo! is "Our City". This site aggregates dynamic content about specific cities and is currently available for 20 of them.

Really interesting!!!

Tuesday, April 24, 2007

Computing Power On Tap - $1 Per CPU Hour!!

Sun (www.sun.com) has announced a grid/utility computing resource, which they have named Network.com (www.network.com). Network.com is backed by a large grid of computers running Solaris 10 and offers a catalog of applications.

The idea is that one can tap this huge, high-powered computing resource and pay only for the time actually used. That makes supercomputing power available to a large community of users without requiring them to buy and own large resources. The applications available right now cover the following:
  • Computational Mathematics: numerical computing, linear programming, and statistical computing and graphics applications.
  • Computer Aided Engineering: several Finite Element Analysis packages.
  • Electronic Design Automation: SPICE simulation.
  • Life Sciences: a large selection of life-science-related applications.
  • General: 3D rendering and environmental modeling packages.
The portal also allows software vendors to deploy their applications on a "pay per use" basis, charging whatever fee they think appropriate. Users can upload their own applications too.
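
At a flat $1 per CPU-hour, budgeting a job is simple arithmetic. A quick sketch, with hypothetical job numbers of my own:

    # Hypothetical job: a rendering task spread across 50 CPUs for 4 hours.
    cpus = 50
    wall_hours = 4
    rate_per_cpu_hour = 1.00   # Sun's advertised flat rate, in USD

    cost = cpus * wall_hours * rate_per_cpu_hour
    print(f"${cost:.2f} for {cpus * wall_hours} CPU-hours")  # $200.00 for 200 CPU-hours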

Right now it is available only to users in the US, but it should be extended to others soon.

Tuesday, April 10, 2007

IPTV Revenue Set To Grow To $39.1 bn By 2011

According to iSuppli, the IPTV market's subscription revenue is set to grow from $960.5 million in 2006 to $39.1 billion by 2011. That's roughly a 40-fold increase!!
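
That multiple is easy to verify, and it implies a compound annual growth rate of roughly 110% over the five years:

    rev_2006 = 960.5e6   # iSuppli's 2006 figure, in USD
    rev_2011 = 39.1e9    # iSuppli's 2011 projection, in USD
    years = 5

    multiple = rev_2011 / rev_2006
    cagr = multiple ** (1 / years) - 1
    print(f"{multiple:.1f}x growth, ~{cagr:.0%} per year")  # 40.7x growth, ~110% per year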

According to the analyst, the attractions for consumers are typically interactivity, personalization and the integration of voice, data and value-added services.

Wide-band media to the home is a prerequisite for a reasonable consumer experience. As it becomes more widely available, these services become feasible and more service providers jump in. Telcos have wideband media in the ground and are building more and more, so it's natural that they would jump on this triple-play bandwagon. The other players are the providers of cable TV services, again with fairly wideband media in the ground. What they need is a return path to enable interactivity, and they are addressing this through cable modem development and standardization.

And what happens to satellite TV? Besides technological features, it is content that attracts consumers, and satellite TV seems to have taken the approach of providing unique programming!

Overall, consumers are in for an interesting time, whichever way this plays out.

Friday, April 6, 2007

ASUS Announces Launch of R2H UMPC

Asus announced the launch in India on the 29th of March. Truly ultra-mobile, this is one of the first production systems of its kind and has really nice features for people who travel. It's a full-featured PC to start with, of course.

The really unique touches are a high-resolution web cam and a GPS, with applications supporting personal navigation included. A good web cam helps in staying in touch with home base, and biometric authentication (fingerprint) ensures the machine is not easily compromised. The usual features such as handwriting recognition are part of the package. The 7" color display is a nice size for this format; controls sit on the two sides of the display for easy manipulation, and a full-featured keyboard can be brought up in soft form on the display. The whole thing is driven by Windows XP, Tablet PC Edition. The specifications for the gadget follow.
R2H Specification
  • Intel® Celeron® M ULV Processor (900MHz, low power consumption)
  • Genuine Windows® XP Tablet PC Edition
  • 768MB DDRII 667 DRAM
  • 7" WXGA touch screen LCD, ASUS Splendid Video Intelligent Engine
  • 60 GB PATA 1.8" HDD, 4200 RPM
  • Bluetooth® V2.0 + EDR, 3x USB, 1x SD Card Reader
  • 1x GPS, 1x Fingerprint Reader
  • 23.4 x 13.3 x 2.8 cm, 830 gm

More information is available on the ASUS website http://in.asus.com/


Tuesday, April 3, 2007

Building Information Model And All That

It's quite interesting to see how similar concepts take hold in various areas of human endeavor! Back in the early nineties I was part of a setup that created GIS databases. Automated Mapping/Facilities Management based on these databases was already a big deal in the US, and we Indians were trying to establish a beachhead back then. The really nice twist to this kind of database was that it linked graphic elements and standard RDBMS entries together, and that gave a completely new dimension to the information and the way it was used. One could have a map of a place with a lot of information captured about the buildings in it, the roads and so on. One huge application was for utilities, be it electric, gas, water or whatever. If all the relevant information about an electric feed, the poles, switches etc. can be captured and then queried, it becomes a very useful tool during design and construction as well as over the operational lifetime, when it helps with maintenance activities too.
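
The linkage itself is simple to picture. Here is a minimal sketch with a made-up schema (my own illustration, not any particular GIS product): each graphic element carries a feature_id that keys into ordinary relational tables, so picking a pole on the map can pull up its maintenance records.

    import sqlite3

    # Toy GIS-style linkage (hypothetical schema): each map feature stores
    # geometry plus a feature_id that keys into a plain relational table.
    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE features (feature_id INTEGER PRIMARY KEY,
                               kind TEXT, x REAL, y REAL);   -- graphic side
        CREATE TABLE attributes (feature_id INTEGER,         -- RDBMS side
                                 name TEXT, value TEXT);
    """)
    con.execute("INSERT INTO features VALUES (1, 'pole', 77.59, 12.97)")
    con.executemany("INSERT INTO attributes VALUES (?, ?, ?)",
                    [(1, 'material', 'steel'),
                     (1, 'last_inspected', '1993-06-01')])

    # Pick a feature on the map, then query its attribute records.
    for row in con.execute("""SELECT f.kind, a.name, a.value
                              FROM features f
                              JOIN attributes a USING (feature_id)
                              WHERE f.feature_id = 1"""):
        print(row)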

Later in the nineties, when I got involved with CAD/CAM, PLM or Product Lifecycle Management slowly became all the rage. Besides spanning the full product life cycle, it was to become a tool for concurrent design, collaboration and so on, and web access makes that even easier. That's how products like Windchill, from the PTC stable and working with Pro/E, became so popular.

Now in this new millennium I am in the architectural CAD services industry, and the Building Information Model (BIM) is all the rage. In fact we are differentiating our services by positioning ourselves as a BIM vendor rather than just CADD! By attaching pieces of information to the design model, we make both the model and the information a useful tool through the life cycle of the building being designed. Design is quicker and less error prone, and the information is useful in the construction as well as the occupancy phase, the same way the GIS or PLM database is so useful. A simple facility management system can make life so much easier during the occupancy phase, helping with upkeep in great detail!!

Wednesday, March 14, 2007

Issues Of Software Development

I am completely convinced that in today's scenario one or another "agile" methodology is the way to go. However, whatever you do, some common problems appear, mostly people-related. Wouldn't it be wonderful if we did not have to deal with such issues! But there's no way around it; people have to develop the stuff. I'll discuss three blog posts that look at these issues.

James Shore talks about how software has to be "done done" to be useful. One of the cornerstones of agile delivery is that we attempt to deliver fairly well-cooked stuff every release cycle. Problems arise because everybody's sense of closure is widely different. He has a checklist that tells us when it's "done done". A quick look below.
  • All unit, integration and customer tests are finished (Tested)
  • All code is written (Coded)
  • The code is refactored to the team's satisfaction (Designed)
  • The software being released is completely integrated: UI, database, etc. (Integrated)
  • The build script includes any new modules (Builds)
  • The build script includes the story in the automated installer (Installs)
  • The build script updates the database schema if necessary, and the installer migrates data when necessary (Migrates)
  • Programmers, testers and customers have reviewed the story for bugs and UI glitches (Reviewed)
  • All known bugs have been fixed or rescheduled as new stories (Fixed)
  • Customers agree that the story is complete (Accepted)
This is really a chapter from his upcoming book "The Art Of Agile Development"; the rest of the chapter is devoted to how to fulfill all the above criteria.
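
The checklist is essentially a gating predicate: a story ships only when every item holds. A toy sketch of my own (not from Shore's book):

    # A toy "done done" gate -- my own illustration, not Shore's code.
    DONE_CRITERIA = ["tested", "coded", "designed", "integrated", "builds",
                     "installs", "migrates", "reviewed", "fixed", "accepted"]

    def done_done(story: dict) -> bool:
        """True only if the story satisfies every checklist criterion."""
        return all(story.get(criterion, False) for criterion in DONE_CRITERIA)

    story = {c: True for c in DONE_CRITERIA}
    story["reviewed"] = False      # one unfinished item...
    print(done_done(story))        # ...and the story is not "done done": False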

The second issue is taken from another blog post, "The Perils of Pair Programming" by Matt Stephens. Pair programming is an essential practice of the Extreme Programming methodology. When the capabilities of both programmers are well matched, pairing works well, but pairings like novice-novice or novice-expert do not work well at all. Even when, in a novice-expert pairing, the expert is a willing mentor, productivity is likely to suffer, so it is a difficult optimization to achieve in a team in general. Add to that the fact that not everybody is inclined to sit with another person at the programming terminal, or extroverted enough to make it work. It is difficult to select people by programming orientation and pick those who are pair-programming oriented!

A third angle is tackled in Kelly Waters' blog post "What if my Agile colleague won't play ball?". Kelly is an experienced project manager and discusses various ways of dealing with the situation, the essence being counseling first, rather than the team applying peer pressure to make the misfit leave. Change is frightening to many people, and they may not even realize it; hence the counseling- and discussion-based approach. Obviously, when all else fails, surgery has to be resorted to!!

Tuesday, February 13, 2007

80-Core Processor!!!???


Intel has announced a research chip that contains 80 (yes, eighty!!) cores, each with dual floating-point execution units. That's a record level of integration! The chip achieves a performance of 1 teraflops at 3.16 GHz: supercomputer performance that could put a supercomputer on the desk once commercially available.
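
The teraflop figure adds up if we assume each of the two floating-point units in a core completes one fused multiply-add (two flops) every cycle; a quick check:

    cores = 80
    fp_units_per_core = 2
    flops_per_unit_per_cycle = 2   # assuming one fused multiply-add = 2 flops
    clock_hz = 3.16e9

    peak_flops = cores * fp_units_per_core * flops_per_unit_per_cycle * clock_hz
    print(f"{peak_flops / 1e12:.2f} TFLOPS")   # 1.01 TFLOPS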


All that performance at an energy cost of only 62 watts, lower than some of today's PC chips! As I have mentioned a couple of times in past posts, this is what's going to happen: ever more performance at as low a consumption as possible. Otherwise, with the need for more performance increasing the way it is, whether met through such high-performance devices or through a multitude of servers, the energy consumption and the corresponding cooling requirements will become unmanageable.

The only comparable thing has been a supercomputer Intel built with 10,000 Pentium chips, which reached teraflop performance yet consumed 500 KW of power! That supercomputer was housed at Sandia National Laboratories in the USA. The mind boggles just thinking of the possibilities when such power becomes available for personal use! Some of the applications being talked about are real-time speech recognition, multimedia data mining and photo-realistic games. That could lead to immediate speech-based interaction with the computer, like HAL in 2001: A Space Odyssey, or digging out a photo of someone smiling as opposed to frowning! Photo-realism, particularly in real-time animation, would add a totally different dimension to the gaming experience!
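
The efficiency gap is easy to put a number on: at the same one-teraflop level, 500 kW versus 62 W is roughly an 8,000-fold improvement in flops per watt:

    old_power_w = 500_000   # the Sandia-class teraflop machine
    new_power_w = 62        # the 80-core research chip
    print(f"{old_power_w / new_power_w:,.0f}x better flops per watt")  # 8,065x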

With several processors running together, the system bus saturates very quickly. Caches help, but even then limits are reached soon as the numbers increase. Memory interface speeds, data management between the cores and keeping them coherent are some of the problem areas, particularly the interface speeds achievable on such a system bus.

Intel's stated aim is to research these areas with a chip like this. The innovations generated in the project, as per Intel, are as follows:

  1. Rapid design - The tiled-design approach allows designers to use smaller cores that can easily be repeated across the chip. A single-core chip of this size (100 million transistors) would take roughly twice as long and twice as many people to design.
  2. Network on a chip - In addition to the compute element, each core contains a 5-port message-passing router. These are connected in a 2D mesh network that implements message passing. This mesh interconnect scheme could prove much more scalable than today's multi-core chip interconnects, allowing for better communication between the cores and delivering more processor performance. (A toy routing sketch follows this list.)
  3. Fine-grain power management - The individual compute engines and data routers in each core can be activated or put to sleep based on the performance required by the application a person is running. In addition, new circuit techniques give the chip world-class power efficiency: 1 teraflops requires only 62W, comparable to desktop processors sold today.
  4. Other innovations - Such as sleep transistors, mesochronous clocking, and clock gating.
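
To make item 2 concrete, here is a toy sketch of dimension-ordered "XY" routing, a common scheme for 2D mesh interconnects (my own illustration, not Intel's actual router design): a message first travels along the X axis to the destination column, then along Y, one hop per router.

    # Toy dimension-ordered (XY) routing on a 2D mesh -- an illustration of
    # how a mesh interconnect can forward messages, not Intel's design.
    def next_hop(cur, dst):
        """One routing step: move along X first, then along Y."""
        x, y = cur
        dx, dy = dst
        if x != dx:
            return (x + (1 if dx > x else -1), y)
        if y != dy:
            return (x, y + (1 if dy > y else -1))
        return cur  # already at the destination

    def route(src, dst):
        """Full path a message takes from src to dst, hop by hop."""
        path = [src]
        while path[-1] != dst:
            path.append(next_hop(path[-1], dst))
        return path

    # On a hypothetical 8x10 mesh (80 cores), from core (0, 0) to core (3, 2):
    print(route((0, 0), (3, 2)))
    # [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2)]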