Sony, IBM and Toshiba developed the Cell chip jointly, and Sony went on to use it in the PlayStation 3. With a coordinating processor and eight execution units in each chip, it is well suited to heavy number crunching; Sony needed that for high-speed processing of high-resolution graphics. Fortunately, while doing so they left the PS3 as an open platform.
Now comes news that Gaurav Khanna, Professor of Astrophysics at the University of Massachusetts Dartmouth, has put together eight of these machines and is milking them for supercomputer performance, all for an investment of about $4,000 and some testing time. That really is good news for people looking for performance at a reasonable price!!
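Just to give a flavor of the kind of embarrassingly parallel work such a cluster eats up, here is a tiny, hypothetical sketch in ordinary Python (plain multiprocessing on one machine, not the actual Cell SDK code such a cluster would run): it splits a numerical integration into eight chunks, one per "node".

```python
from multiprocessing import Pool

def partial_integral(args):
    """Midpoint-rule integral of f(x) = x*x over one slice of [0, 1]."""
    start, end, steps = args
    h = (end - start) / steps
    return sum((start + (i + 0.5) * h) ** 2 for i in range(steps)) * h

if __name__ == "__main__":
    workers = 8  # one slice per "node" in this toy setup
    slices = [(i / workers, (i + 1) / workers, 100_000) for i in range(workers)]
    with Pool(workers) as pool:
        total = sum(pool.map(partial_integral, slices))
    print(f"integral of x^2 over [0, 1] ~= {total:.6f}")  # expect about 0.333333
```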
Sunday, November 25, 2007
1 Gbit Memory Chips Become Mainstream
512 Mbit memory chips had been the mainstream component for a long time. By October, however, the next leap seems to have happened: 1 Gbit chips have taken over. Hardware manufacturers made the switch in anticipation of the Vista OS catching up fast in popularity. Vista is memory hungry, so the switch seemed logical.
There have been strange happenings in the PC scene, however. Most PC manufacturers started pre-loading their machines with Vista soon after the release. Soon, though, news started trickling in that enterprises were pressing manufacturers for the option to switch to XP. The norm now is that XP discs which can be installed without going back to MS for license keys are being shipped with systems. Most new purchases in the enterprise segment reportedly are falling back on XP.
Kind of strange!!
Sunday, October 14, 2007
Significant Technologies Of Next Year
Gartner has announced its annual heads-up about technologies/trends to watch out for in the coming year. The list is below.
1. Green IT
2. Unified communications
3. Business process management
4. Metadata management
5. Virtualization
6. Mashups
7. The web platform
8. Computing fabric
9. Real world web
10. Social software
There's increasing pressure to be energy efficient and environmentally friendly in general, and this will be significant. Trends are already visible, and all initiatives/activities in this area need to be watched closely. There's also the prospect of regulations coming into force sooner or later.
Unified communications we have been hearing about quite a lot for some time. Things are happening, and it's coming closer to widespread use. Newer data/message types may get included, security video among them; that would help you, for example, gauge the character of retail traffic at a store.
Business process management is a clutch of things happening together, mainly so that business processes are understood clearly through modeling where necessary. As the Gartner analyst put it, this may include model-driven development, content and document management, collaboration capabilities, system connectivity, business intelligence, activity monitoring and management, rules, and systems management.
Metadata management was bound to come along. As ever-increasing data is acquired for analysis, it was obvious that some form of metadata management would soon be needed to keep the underlying data manageable.
Virtualization enters its second generation. Manufacturers are shipping products with the necessary middleware along with the OS. Virtualization becomes even more important in that one can mirror the production system for disaster recovery.
Mashups, i.e. combining things from multiple websites, are going to be increasingly important, and so will building services in a way that lets mashups be created easily. In the same manner, the web as a platform is going to be increasingly visible too, meaning you will use and create web services more and more.
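As a rough illustration of what a mashup is, here is a minimal sketch that pulls JSON from two independent services and combines them into one view; the endpoints and field names are entirely made up.

```python
import json
from urllib.request import urlopen

# Hypothetical endpoints - substitute any two JSON services you actually have access to.
WEATHER_URL = "https://api.example.com/weather?city=Pune"
EVENTS_URL = "https://api.example.com/events?city=Pune"

def fetch_json(url):
    """Download and parse one JSON document."""
    with urlopen(url, timeout=10) as resp:
        return json.load(resp)

def city_dashboard():
    """The 'mashup': two unrelated services combined into a single view's data."""
    weather = fetch_json(WEATHER_URL)
    events = fetch_json(EVENTS_URL)
    return {
        "city": "Pune",
        "forecast": weather.get("forecast"),
        "events_today": [e["name"] for e in events.get("events", [])],
    }

if __name__ == "__main__":
    print(city_dashboard())
```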
In a computing fabric, memory, processors and I/O become part of a fabric that can combine and share these resources as required; that would be an improvement on the current structure of data centres.
The computing experience made possible by ever-present mobile devices is what has come to be known as the real-world web. Thanks to this availability one can look up a range of information on the spot, be it travel details or even where a grocery item sits on the shelves of a store.
And then finally, social software. Podcasts, blogs and wikis are all tools of this area, which fosters social networking. Well, your individual priority for the coming year could be different, but do keep an eye open for what's happening in these areas!
Thursday, October 11, 2007
AMD Quad cores
AMD announced their quad-core processors at the end of last month. Some nine versions were introduced, positioned for the server market. They operate around the 2 GHz mark within a power budget of under 100 watts. That should mean some competition for Intel!!
Saturday, September 29, 2007
Downgrade Rather Than Upgrade!!!!
Microsoft is obviously pushing Windows Vista very hard. The Vista Business and Ultimate versions must be the ultimate in Vista flavors on offer. Then how come MS is making it easy for business users to downgrade to the known and dependable XP, the last generation of the Windows family?
MS is making it easy for PC box shippers with prepackaged Vista PCs to ship XP discs along with the system. These discs are preactivated, i.e. not even a call to support to get an activation code would be necessary. Apparently the EULA (end user license agreement) has always had that provision. Fujitsu has been the most aggressive among the box movers so far; now others are going to do it too.
Is it just resistance to change? It does not look like it, though. It looks more like an attempt by enterprises to stick with an old faithful. So what's at stake here? Is it that Vista would need more training time? Or are there reliability and availability related issues? Who knows! The only thing one can be sure of is that MS is being pragmatic about it, though the expected protestation is being repeated: this does not mean any admission that Vista has any shortcoming.
Saturday, September 1, 2007
Interesting 64 core Processors
Sun recently announced the T2 CPU. It has 8 cores, each capable of running 8 threads simultaneously. That's 64 processing threads running in parallel! All that happens with only 95 W energy consumption!! That translates into awesome processing power in servers, helped by Solaris, which makes development easier.
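To see why so many hardware threads matter for server work, here is a small, purely illustrative sketch: software threads hide a fixed "latency" (a stand-in for I/O or memory stalls), so throughput scales with the thread count. The 50 ms delay and request counts are made-up numbers, not T2 figures.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def handle_request(i):
    time.sleep(0.05)  # stand-in for I/O or memory latency that extra threads can hide
    return i

def serve(n_requests, n_threads):
    """Time how long it takes to push n_requests through a pool of n_threads."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        list(pool.map(handle_request, range(n_requests)))
    return time.time() - start

if __name__ == "__main__":
    for threads in (1, 8, 64):
        print(f"{threads:>2} threads -> {serve(256, threads):.2f} s for 256 requests")
```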
Now if we had a true 64-core processor that worked at less than 20 watts!! That would be amazing!
Tilera announced recently that they have such a chip, called the Tile64. Please read the detailed story here. One of the major problems with multi-core chips (I have blogged about this before) is the main system bus on the chip, which starts to saturate with data flow despite multi-level caches in the architecture. Tilera claims to have overcome this problem, so that truly massively parallel chips would now be possible. CPU cores in the Tile64 chip exchange data through a mesh architecture, and no such bottleneck exists: a switching matrix interconnects the processors, and there are separate memory controllers.
The Tile64 processor has a RISC architecture and is offered at clock speeds of 600 MHz and 1 GHz. Each switch can transmit data to its 4 neighbors at 500 Gbps rates. These processors are positioned for use in routers, switches, appliances, video conferencing and set-top boxes.
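For a feel of how a mesh avoids the shared-bus bottleneck, here is a toy sketch of dimension-ordered (XY) routing on a grid of cores: each message hops through neighboring routers rather than contending for one bus. This is generic mesh routing, not Tilera's actual routing policy.

```python
def xy_route(src, dst):
    """Hop-by-hop path from core src to core dst on a 2D mesh: X direction first, then Y."""
    x, y = src
    path = [src]
    step_x = 1 if dst[0] > x else -1
    while x != dst[0]:
        x += step_x
        path.append((x, y))
    step_y = 1 if dst[1] > y else -1
    while y != dst[1]:
        y += step_y
        path.append((x, y))
    return path

if __name__ == "__main__":
    # Core (0, 0) talking to core (7, 5) on an 8x8 mesh: 7 + 5 = 12 hops, no shared bus.
    print(xy_route((0, 0), (7, 5)))
```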
Wednesday, August 15, 2007
Crossroads
Life has a strange way of bringing you to a crossroads where you have to decide the direction you want to take from then on. I am at one such point. Having reached 60 years of age, I am at the traditional point where one retires from professional life. I am just out of an assignment and could easily take that path.
I do, however, feel I have quite a bit of mileage left. I have always wanted to teach. My father and grandfathers were teachers; maybe that's why. Maybe I understand things well and then love to explain them to others! I have found that to be true all through my student days as well as during my professional life.
During my student years I loved to lead the informal study groups that always formed. Similarly, I jumped at opportunities to train others whenever they arose during my working years. I really, truly enjoyed the experience too. So why not!!
Monday, July 9, 2007
TeraGrid: The Superpowered Supercomputing Grid
The National Center for Supercomputing Applications announced a super-powered computing grid. While 280 teraflops of computing power and 20 petabytes of storage are impressive, it is the user interface that is supposed to be the crucial element for users.
Is anyone going to need more computing power? Well, there is always an application coming along that needs more resources than are available, and so catches up with whatever is on offer. It has always happened in computing history, and I'm sure it is going to happen soon on this "TeraGrid" too. Until then it'll make the lives of researchers in advanced areas quite a bit easier.
Friday, June 15, 2007
Wireless World: Approaching the Wireline Barrier
The biggest problem with wireless networks has been the fact that we are dealing with a limited-bandwidth broadcast medium (that and security, which is a completely different story, though not completely unrelated). The best technology widely available at this time is IEEE 802.11a & g (Wi-Fi), both of which work at a maximum rate of 54 Mbps (though the perceived data rate is much lower, as is the data rate available per user when a number of users share the channel, which is typically the case in any network). They are therefore nowhere close to current Ethernet (100 Mbps/1 Gbps). Moreover, the current brand of Ethernet is not exactly a broadcast technology but a switched technology, so not all users share the 100 Mbps or 1 Gbps (the sharing is quite complicated and depends essentially on the switch - I'll discuss this in a future post).
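A back-of-envelope way to see the gap: divide the nominal link rate by the number of users sharing the channel, after knocking off protocol overhead. The 50% efficiency figure below is a guess for illustration; real Wi-Fi overhead varies a lot with conditions.

```python
def per_user_throughput(link_rate_mbps, users, efficiency=0.5):
    """Rough per-user goodput on a shared half-duplex channel.

    efficiency approximates protocol overhead (headers, ACKs, contention);
    it is an assumed figure, not a measured one.
    """
    return link_rate_mbps * efficiency / users

if __name__ == "__main__":
    for n in (1, 5, 10, 20):
        print(f"802.11g, {n:>2} users: ~{per_user_throughput(54, n):.1f} Mbps each")
```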
The solution seems to be the upcoming IEEE 802.11n standard, which makes use of MIMO technology to achieve much higher bandwidths (claims are being made for up to >500 Mbps, which for me really means we can expect decent 100-200 Mbps performance). That is going to be an interesting step towards reducing wires.
In related news, one of the things that limits any wireless technology is power, and with laptops etc. we eventually have a situation where we need wires for power. However, a group at MIT seems to be coming up with a solution to that problem. If this comes through, then this really is a step towards a wireless world. Read about it here.
Incredible Journey of 800 ps
The Programmable Logic DesignLine features a wonderful article called the incredible journey of an 800 picosecond period. Now that's 800 picoseconds = 0.8 nanoseconds = 0.0008 microseconds = 0.0000008 milliseconds = 0.0000000008 seconds, which truly boggles the mind when you consider all that is happening during this time.
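The same arithmetic, restated, along with the clock rate an 800 ps period corresponds to and how far light travels in that time:

```python
period_s = 800e-12                    # 800 picoseconds

frequency_hz = 1 / period_s           # clock rate implied by an 800 ps period
speed_of_light = 299_792_458          # metres per second, in vacuum
distance_m = speed_of_light * period_s

print(f"period    : {period_s:.1e} s")
print(f"frequency : {frequency_hz / 1e9:.2f} GHz")                    # 1.25 GHz
print(f"light travels only {distance_m * 100:.0f} cm in that time")   # ~24 cm
```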
Read the article here; it's more than a useful read - in fact it is fairly informative and even entertaining.
Microsoft's New Surface Computer
A lot of people in the computing community tend to dismiss Microsoft, often saying that Microsoft has done nothing of note and exists mostly by copying or living off other people's ideas. Although they have a point - MS is famous for taking other people's ideas - they miss the fact that MS is great at marketing products (even if it is through aggressive brand placement and tie-ups; every major business does it, including Mr. Steve "Fashionable" "Apple" Jobs) and also at developing them further. From time to time, Microsoft has also come up with good ideas (which in time others have followed). It can be argued that those ideas were in fact taken from smaller companies that they bought out, but that would also be true of Apple or any other competitor.
The latest offering from Microsoft is the amazing new Surface computer. Here is a YouTube video showing what it's capable of (in typical MS fashion there is at least one blemish :) ) -
Here are some links to find out more about it:
http://www.microsoft.com/surface/
http://www.techcrunch.com/2007/05/29/microsoft-announces-surface-computer/
Certainly looks like the future of computing, especially for Artists, Designers, Engineers and the like. The Tablet PC (another concept popularized by MS) helps to some extent and this is the logical next step. We are probably a few years away from a feasible and affordable variant though. But this is promising. After a long time, something in the Computer world that excites me.
Sidenote: A lot of people are invested in that Mac vs. Windows debate - but it is really a useless debate, because all the fighting over who came up with what is pretty pointless: most personal computer ideas (and I include Macs in that category) originated at that fantastic R&D center known as Xerox PARC. Of course, people outside the computing industry are unaware of this and that's OK, but what surprises me is how easily they are willing to fight for their respective loyalties, and also how many people from a computing background are unaware of it too.
The real debate is Windows vs. Linux, or to be exact proprietary vs. free, but that debate is outside the scope of this post. Suffice to say that in the honest opinion of anyone who deals with computers and loves them, the answer is simple: it's all good.
Whether Microsoft or Apple should be stopped from doing the things they do is part of a bigger question, that of capitalism and what is acceptable there. The rich always persecute the poor, and that is unacceptable.
But consider this: the Gates Foundation is the largest independent charitable organization in the world. Where would that be without Microsoft?
Tuesday, May 8, 2007
Yahoo! India Maps
Yahoo! has maps of 170 cities, 4,785 towns and more than 220,000 villages, searchable by features such as address, street name, or things like ATM or hospital locations. Yahoo!, along with CE Infosystems, has produced the maps. Ever since getting involved in GIS services, such detailed maps in the Indian context are one thing I have been hearing about. It looks like it's finally happening. GIS is certainly going to become mainstream soon.
Another offering from Yahoo that's interesting is "our city". This site aggregates dynamic content about specific cities, available currently for 20 of them.
Really interesting!!!
Tuesday, April 24, 2007
Computing Power On Tap - $1 Per CPU Hour!!
Sun (www.sun.com) has announced a grid/utility computing resource. They have named it Network.com (www.network.com). Network.com is backed by a large grid of computers running Solaris 10 and has a Network.com Applications catalog.
What happens is that one can utilize this huge, high-powered computing resource and pay only for the time actually used. That makes supercomputing power available to a large community of users without making it necessary to buy and own large resources. The applications available right now cover the following.
- Computational Mathematics: Numerical computing, Linear Programming and Statistical computing and graphics applications.
- Computer Aided Engineering: Several Finite Element Analysis packages.
- Electronic design Automation: Spice simulation.
- Life Sciences: Large selection of Life Science related applications.
- General: 3D rendering and environmental modeling packages.
Right now it is available to users in the US only, but should get extended to others soon.
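For a sense of the utility-pricing model, here is a trivial cost sketch at the advertised $1 per CPU-hour; the job sizes below are hypothetical.

```python
def job_cost(cpus, hours, rate_per_cpu_hour=1.0):
    """Utility-computing bill: pay only for the CPU-hours actually used."""
    return cpus * hours * rate_per_cpu_hour

if __name__ == "__main__":
    # Hypothetical jobs - e.g. an overnight render versus a day-long simulation.
    print(f"100 CPUs x  8 hours -> ${job_cost(100, 8):,.0f}")
    print(f"500 CPUs x 24 hours -> ${job_cost(500, 24):,.0f}")
```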
Tuesday, April 10, 2007
IPTV Revenue Set To Grow To $39.1 bn By 2011
According to iSuppli, IPTV subscription revenue is set to grow from $960.5 million in 2006 to $39.1 billion by the year 2011. That's roughly a 40-times increase!!
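A quick check on the arithmetic behind that "40 times" headline, and the compound annual growth rate it implies over the five years:

```python
start, end, years = 0.9605, 39.1, 5    # revenue in $ billions, 2006 -> 2011

multiple = end / start                  # overall growth factor
cagr = (end / start) ** (1 / years) - 1

print(f"overall growth : {multiple:.1f}x")        # ~40.7x
print(f"implied CAGR   : {cagr:.0%} per year")    # roughly 110% per year
```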
Typically, interactivity, personalization and the integration of voice, data and value-added services are the attraction for consumers, according to the analyst.
Wideband media to the home is a prerequisite for a reasonable consumer experience. As this becomes more and more available, these services become feasible and more service providers jump in. Telcos have wideband media on the ground and are building more, so it's natural that they would jump onto this triple-play bandwagon. The other players are the providers of cable TV services, again with fairly wideband media in the ground; what they need to provide is a return path to enable interactivity, and they are addressing this through cable modem developments/standardizations.
What happens to satellite TV? Besides technological features, it is the content that attracts consumers, and satellite TV seems to have taken the approach of providing unique programming!
Overall consumers are in for an interesting time, whichever way this plays out.
Friday, April 6, 2007
ASUS Announces Launch of R2H UMPC
ASUS announced the launch in India on the 29th of March. Truly ultra-mobile, one of the first production UMPCs has really nice features for travellers. It's a full-featured PC to start with, of course.
Really unique touches are a high-resolution webcam and a GPS; applications supporting personal navigation are included. The good webcam helps in staying in touch with home base. Biometric authentication (fingerprint) ensures it is not easily compromised. The usual features such as handwriting recognition are part of the package. The 7" color display is a nice size for this format. Controls are on two sides of the display for easy manipulation, and a full-featured keyboard can be brought up in soft form on the display. The whole thing is driven by Windows XP, Tablet PC Edition. Following are the specifications for the gadget.
R2H Specification
Intel® Celeron® M ULV Processor (900MHz, low power consumption)
Genuine Windows® XP Tablet PC Edition
768MB DDRII 667 DRAM
7" WXGA touch screen LCD, ASUS Splendid Video Intelligent Engine
60 GB PATA 1.8" HDD, 4200 RPM
Bluetooth® V2.0 + EDR, 3x USB, 1x SD Card-Reader,
1x GPS, 1x Finger Print Reader
23.4 x 13.3 x 2.8 cm, 830 g
More information is available on the ASUS website http://in.asus.com/
Tuesday, April 3, 2007
Building Information Model And All That
It's quite interesting to see how similar concepts take hold in various areas of human endeavor! Back in the early nineties I was part of a set-up that created GIS databases. Automated Mapping/Facilities Management based on these databases was already a big deal in the US, and we Indians were trying to establish a beachhead back then. The real nice twist to this kind of database was that it linked graphics elements and standard RDBMS entries together, and that gave a completely new dimension to the information and the way it was used. One could have a map of a place and have a lot of information captured for the buildings in it, the roads and so on. One huge application was for utilities, be it electric, gas, water or whatever: if all the relevant information about an electric feeder - the poles, switches etc. - can be captured and then queried, it becomes a very useful tool during design and construction as well as over the operational lifetime, where it helps with maintenance activities too.
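The core idea - a graphic element keyed to ordinary relational attributes - can be sketched in a few lines. The table, columns and coordinates below are invented for illustration; a real AM/FM system would of course use a proper spatial database.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- One row per drawn feature: geometry stored alongside ordinary attributes.
    CREATE TABLE poles (
        id INTEGER PRIMARY KEY,
        x REAL, y REAL,              -- map coordinates of the graphic element
        material TEXT,
        installed_year INTEGER,
        last_inspected TEXT
    );
    INSERT INTO poles VALUES
        (1, 73.85, 18.52, 'wood',  1991, '2006-11-02'),
        (2, 73.86, 18.53, 'steel', 1998, '2005-03-17');
""")

# The GIS payoff: query on attributes, get back locations to highlight on the map.
for row in conn.execute(
        "SELECT id, x, y FROM poles WHERE last_inspected < '2006-01-01'"):
    print("overdue for inspection:", row)
```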
Then, when later in the nineties I got involved with CAD/CAM, PLM or product lifecycle management slowly became all the rage. Besides spanning the full product life cycle, it was to become a tool for concurrent design, collaboration etc., and web access makes that even easier. That's how products like Windchill, from the stable of PTC and working with Pro/E, became quite the rage.
Now, in this new millennium, I am in the architectural CAD services industry and the Building Information Model (BIM) is all the rage. In fact we are differentiating our services by positioning ourselves as a BIM vendor rather than just CADD! By attaching pieces of information to the design model we make the model, as well as the information, a useful tool through the life cycle of the building being designed. Design is quicker and less error prone, and the information is useful in the construction as well as the occupancy phase - the same way the GIS or the PLM database is so useful. A simple facility management system can make life so much easier during the occupancy phase, helping with the upkeep in great detail!!
Wednesday, March 14, 2007
Issues Of Software Development
I am completely convinced that in today's scenario one or the other "agile" methodology is the way to go. However, whatever you do, some common problems appear, mostly people related. Would it not be wonderful if we did not have to deal with such issues! But there's no way around it; people have to develop the stuff. I'll basically be discussing 3 blog posts that look at these issues.
James Shore talks about how software has to be "done done" to be useful. One of the cornerstones of agile delivery is that we attempt to deliver fairly well-cooked stuff every release cycle. Problems arise because everybody's sense of closure is widely different. He has a checklist that tells us when it's "done done"; a quick look below.
- All unit, integration and customer tests are finished (tested)
- All code has been written (coded)
- Code refactored to the team's satisfaction (designed)
- The software being released is completely integrated - UI, database etc. (integrated)
- The build script includes any new modules (builds)
- The build script includes the story in the automated installer (installs)
- The build script updates the database schema if necessary, and the installer migrates data when necessary (migrates)
- Programmers, testers and customers have reviewed the story for bugs and UI glitches (reviewed)
- All known bugs have been fixed or rescheduled as new stories (fixed)
- Customers agree that the story is complete (accepted)
The second issue is taken from another blog post, "Perils of pair programming" by Matt Stephens. Pair programming is an essential practice in the extreme programming methodology. When the capabilities of both programmers are nearly matched, pairing works well, but pairings like novice-novice or novice-expert do not work well at all. Even when, in a novice-expert pairing, the expert is a willing mentor, productivity is likely to suffer, so it is a difficult optimization to achieve across a team. Add to that the fact that not everybody is inclined to sit with another person at the programming terminal and be extroverted enough to make it work. It is difficult to select people by programming orientation and pick those who are pair-programming oriented!
A third angle is tackled in Kelly Waters' blog post "What if my Agile colleague won't play ball?". Kelly is an experienced project manager and discusses various ways of dealing with the situation. The essence is counseling first, rather than the team applying peer pressure to make the misfit leave. Many people have a phobia of change and may not even realize it; hence the counseling- and discussion-based approach. Obviously, when all else fails, surgery has to be resorted to!!
Tuesday, February 13, 2007
80 Core Processor !!!???
Intel has announced a research chip that contains 80 (yes, eighty!!) cores, each with a dual floating-point execution unit. That's a record level of integration! The chip achieves a performance of 1 teraflops at 3.16 GHz - supercomputer performance that could put a supercomputer on a desk once it is commercially available.
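The rough arithmetic behind the teraflop figure, assuming (my assumption, not Intel's wording) that each floating-point unit completes one multiply-add, i.e. two flops, per cycle:

```python
cores = 80
fp_units_per_core = 2           # "dual floating point execution unit" per core
flops_per_unit_per_cycle = 2    # assumed: one fused multiply-add per cycle
clock_hz = 3.16e9

peak = cores * fp_units_per_core * flops_per_unit_per_cycle * clock_hz
print(f"peak ~ {peak / 1e12:.2f} teraflops")   # about 1.01 teraflops
```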
All that performance at an energy expense of only 62 watts, lower than some PC chips today! As I have mentioned a couple of times in past posts, this is what's going to happen: ever more performance at as low a consumption as possible. Otherwise, the way the need for more performance keeps growing - whether through such high-performance devices or a multitude of servers - energy consumption and the corresponding cooling requirements will become unmanageable.
The only comparable thing has been a supercomputer built by Intel with 10,000 Pentium chips that reached teraflop performance yet consumed 500 kW of power! That machine was housed at Sandia National Laboratories in the USA. The mind boggles just thinking of the possibilities when such power becomes available for personal use! Some of the applications being talked about are real-time speech recognition, multimedia data mining, photo-realistic games etc. That could lead to immediate speech-based interaction with the computer, like HAL of 2001: A Space Odyssey, or digging out a photo of someone smiling as opposed to frowning. Photo-realism, particularly in real-time animation, would add a totally different dimension to the gaming experience!
With several processors running together, the system bus saturates very quickly. Caches help, but even then limits are reached as the numbers increase. Memory interface speeds, data management between the cores, and keeping them coherent are some of the problem areas - particularly the interface speeds possible on the system bus.
Intel's stated aim is to research these areas with a chip like this. The innovations generated in the project, as per Intel, are as follows:
- Rapid design – The tiled–design approach allows designers to use smaller cores that can easily be repeated across the chip. A single–core chip of this size (100 million transistors) would take roughly twice as long and twice as many people to design.
- Network on a chip – In addition to the compute element, each core contains a 5–port message-passing router. These are connected in a 2D mesh network that implements message passing. This mesh interconnect scheme could prove much more scalable than today’s multi–core chip interconnects, allowing for better communications between the cores and delivering more processor performance.
- Fine–grain power management – The individual compute engines and data routers in each core can be activated or put to sleep based on the performance required by the application a person is running. In addition, new circuit techniques give the chip world–class power efficiency—1 teraflops requires only 62W, comparable to desktop processors sold today.
- And other innovations – Such as sleep transistors, mesochronous clocking, and clock gating.
Tuesday, February 6, 2007
It's Not Done Until It Is Shippable
James Shore is writing a book on agile methodology, "The Art Of Agile Development", and parts of it are being published on his blog. The latest is "Done Done". His point is that, whatever the methodology, a piece of software is not "done" until it is shippable - in the sense that it has been tested, detectable bugs have been ironed out, and so on.
It is surprising how wide a range of meaning individuals attach to something being done; it can really span a huge range of completeness. Some would consider it done when they have finished coding and the piece just about runs - nobody knows whether rightly or wrongly. The other end of the spectrum, obviously, is a tested, tried, polished piece of functionality.
I still remember reading an article on the "sense of closure"; it was precisely this issue being discussed. I have always maintained with my colleagues that when reporting something as complete, the only test one can use is: is it ready to be shipped? Would somebody pay money to buy that piece of functionality? That, of course, applies to any kind of product, be it software, hardware, or a mix of both in the form of an embedded product.
Wednesday, January 31, 2007
GIS/GPS Really, Truly Mainstream Now!!
Time (Nov 20 issue, Asia edition) features three of the most portable and affordable GPS and combination devices as "Gift of the year". The three devices are the Delphi NAV200, Mio H610 digiWalker and Garmin StreetPilot c550. All three are really, truly portable - actually hand-held sized - and prices range from about $300 to $750 or so. Location/navigation has become ready for mass use. These can be used for street walks as well as for driving guidance, and they come with preloaded maps, cute user interfaces and added functionality such as MP3 player, phone, PDA etc.
As these are based on quite powerful processors, additional functionality becomes easy - flight directories, weather, traffic updates and so on. I can imagine a time very soon when, instead of a Lonely Planet country guide, tourists carry one such device. The device would be multimedia capable and provide all the background information those guides provide!
Tuesday, January 16, 2007
ADT Still Has Mileage Left
Even when a newer product is introduced and actively promoted by a company, the product due to be superseded quite often shows surprising strength. Though Autodesk is actively promoting Revit, Architectural Desktop (ADT) does not show any signs of becoming obsolete yet. In fact, there seems to be active third-party support that adds to the product's vitality.
H. E. Goldberg talked about two such resources in a Cadalyst issue. One of the advantages of ADT pointed out by Goldberg is an open API, and VisionREZ is cited as a shining example of what can still be done with ADT! VisionREZ is a customized solution built on ADT for the residential market and makes many design issues much simpler than they are in the original product. AmeriCAD has other resources for customer training and provides a set of services based on ADT.
ARCHIdigm is another example of a useful resource; the author cites them for very useful training material on using ADT effectively.
I have been associated with both products. A couple of years back I was part of a set-up that used to produce localized versions of Autodesk software, though I never got very familiar with using the product. In my current avatar, I am into Revit quite heavily. I already see some workflow compatibility issues, because Revit and AutoCAD did not come from the same roots historically. I guess that would not be much of an issue with ADT, as it's built on top of AutoCAD. I need to find out more.
Wednesday, January 10, 2007
SCRUM Works For BBC
I have been looking for evidence of how effective these "agile" methods of software development really are. What the specific religionists say, though honest, is always biased a little in favor of the religion/process they are promoting. I do feel these methods ought to work wonders; waterfall and other such document/process-heavy schemes are just not workable the way they are supposed to be.
Here comes a case study that shows how the BBC has used Scrum to great advantage. The story was presented at JAOO, a premier European developer conference on software technology, methods and best practices. The conference features in-depth presentations and tutorials by researchers, engineers and trend-setters in software engineering and technology.
Andrew Scotland, the presenter, is Head of Development within the BBC's New Media Division. He is a certified Scrum master and has successfully introduced Scrum practice into the New Media division's multidisciplinary development teams (Software Engineering, User Experience, Information Architecture, Editorial, Product Management and Project Management).
The author tells how the BBC's New Media division, characterized by a lot of uncertainty and an emergent software process, decided to use Scrum to deliver software more effectively amidst all that change and uncertainty. Three years later, the difference is significant, and the journey was worthwhile.
The environment is ever-changing, so it's really good to see Scrum succeeding in exactly the kind of situation it is supposed to work best in. I'll need to look around for more such stories.