Hi-Tech Ethos
I was chatting with one of my PARC colleagues about one of the projects I am working on, and he made the observation: "nothing is a success until it has been canceled three times."
I think this captures the spirit of Silicon Valley as well as anything I have heard.
Let a thousand ideas bloom
In the old days, the rule of thumb for getting a new idea funded by a VC went something like this:
For every 10 ideas, 1 turns into a business plan.
For every 10 business plans, 1 turns into a VC conversation.
For every 10 VC conversations, 1 gets funded.
In other words, an idea has a 1 in 1,000 chance of getting funded. And that says nothing about whether a funded idea has even a 50-50 chance of surviving beyond its first two years.
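The funnel above is just compounded odds; a short sketch makes the arithmetic explicit (the per-stage rates are the rule-of-thumb figures above, nothing more):

```python
# Back-of-envelope VC funnel: each stage keeps roughly 1 in 10.
stages = {
    "idea -> business plan": 0.10,
    "business plan -> VC conversation": 0.10,
    "VC conversation -> funded": 0.10,
}

p_funded = 1.0
for rate in stages.values():
    p_funded *= rate  # compound the survival rate through each stage

print(f"P(idea gets funded) ~ {p_funded:.3f}")  # ~0.001, i.e. 1 in 1,000
```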
1 in 3 vs. 1 in 1,000
Given the 1 in 1,000 context, having something canceled/rejected three times is really not that big a deal.
On the other hand, if you feel light-headed after reading this statement, you should not be in Silicon Valley.
---
I think the real question is how PARC continues its longevity and relevance. Maybe I can convince somebody in the social sciences group to do an analysis.
===
P@P
Monday, May 18, 2009
Wednesday, May 6, 2009
Earth Day 2009
Subject: PARC Earth Day 2009!
PARC’s Earth Day Celebration this year is going to be held on Wednesday, May 6th.
Our theme this year is: “Sustainable Smarts”. During the event you will be able to:
• Drop off personal e-waste from home. Leave it in the large cardboard bin marked “e-waste”.
• Drop off expired or unwanted medicine for proper disposal at the P.A. Water Quality Dept. booth
• Exchange your old mercury thermometer for a new electronic one for free!
• Pick up a plant, vegetable or flower at our Adopt-A-Plant booth
• Enter a drawing to win sustainable and eco-friendly prizes
• Shop organic produce at the Farmer Stand
• Check out the electric car
• Take a ride on the scooters
• Eat food cooked via solar power
My contributions to PARC Earth Day 2009
Dropped off
• CPU
• Laptop
• PDA
• Cell phones
Got a sweet basil plant.
===
P@P
Thursday, April 30, 2009
Can 802.11 cut the HDTV cords?
802.11/WiFi for HDTV
For those of us in the know, 802.11 is the de facto wireless standard for computing devices: it comes with every wireless PC/laptop and is increasingly available in mobile handsets and gaming consoles.
So the logical question is: can 802.11 become the de facto wireless standard for all electronic devices? Say, instead of wiring up my new HDTV to all the players and other devices, would something like 802.11 help cut the cords?
This is an issue that a lot of industry players are grappling with, and the latest verdict, for whatever it is worth, is yes, according to the analyst report "802.11n Wi-Fi Technology is the Spoiler at the Wireless HD Video Party; Will Dominate" (http://in-stat.com/press.asp?ID=2513&sku=IN0904455MI).
Reality Check
Instead of arguing over the technical details, we conducted an analysis of how 802.11n would perform in typical home scenarios. Think of it as simulating how 802.11n would work in a perfect environment - so your performance at home will only be worse...
Assumptions
It turns out that working out the specific variables/assumptions is the trickiest part. So, this is what we came up with: a home environment with different types of wireless traffic - some web surfing, some VoIP chatter, some TV watching, and some Blu-ray video watching.
We also looked into the distances among all the devices, because interference is a very real issue with 802.11. The farther apart the devices are, the less interference, and vice versa.
So, here are the variables that we tried in our simulations.
Devices:
Blu-ray video
HDTV
Media center (MC) - provides Blu-ray and HDTV to up to two (2) TVs
Web session
VoIP call
Access Point (AP) - supports the web and VoIP sessions
Distances:
Short - 3m/10ft
Medium - 10m/30ft
Long - 15m/50ft
Scenarios
Without being exhaustive, scenarios range from:
Most “forgiving”: 1 MC + 1 HDTV + 1 AP + 1 VoIP + 1 Web at long distance (15m/50ft)
to
Least “forgiving”: 1 MC + 2 Blu-ray + 1 AP + 1 VoIP + 1 Web at short distance (3m/10ft)
Results
The performance threshold we used was less than 200 ms of delay. Anything more than that becomes noticeable to human perception.
The long and short of it is that 802.11n only stays within the acceptable range in the case of one HDTV far away from other devices, i.e. the most "forgiving" scenario. On top of that, all the other users, such as web surfing and VoIP, would have to accept a significant performance hit.
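To give a flavor of the kind of back-of-envelope check involved - this is not our actual simulator, and every number below (stream bitrates, effective shared throughput per device-separation class) is an illustrative assumption - a crude capacity test looks like this:

```python
# Illustrative 802.11n capacity check. All figures are invented for the
# sketch: real effective throughput depends on interference, MCS, walls, etc.

# Rough application demands in Mbps.
DEMAND_MBPS = {"blu_ray": 40.0, "hdtv": 20.0, "web": 2.0, "voip": 0.1}

# Assumed *effective* shared throughput by how far apart the devices sit:
# tightly packed devices interfere with each other, so "short" is worst.
EFFECTIVE_MBPS = {"short": 40.0, "medium": 70.0, "long": 100.0}

def fits(streams, separation):
    """Crude check: does aggregate demand fit under the shared medium?"""
    demand = sum(DEMAND_MBPS[s] for s in streams)
    capacity = EFFECTIVE_MBPS[separation]
    return demand <= capacity, demand, capacity

# Most "forgiving": 1 HDTV + 1 web + 1 VoIP, devices far apart.
ok, demand, cap = fits(["hdtv", "web", "voip"], "long")
print(f"forgiving: {demand} Mbps vs {cap} Mbps -> {'ok' if ok else 'over'}")

# Least "forgiving": 2 Blu-ray + 1 web + 1 VoIP, devices close together.
ok, demand, cap = fits(["blu_ray", "blu_ray", "web", "voip"], "short")
print(f"unforgiving: {demand} Mbps vs {cap} Mbps -> {'ok' if ok else 'over'}")
```

Even under these generous made-up numbers, the unforgiving scenario blows past the shared budget, which is consistent with what the full simulation showed.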
--
I would love to hear your experiences using 802.11 in a mixed-media environment with multiple users. Maybe I am missing something here.
===
P@P
Labels:
802.11,
ad hoc networking,
CAPE,
networking,
Standards
Wednesday, April 22, 2009
New Models for Increasing Innovation
PARC on Innovation - an eBroadcast
From Concept to Commerce: New Models for Increasing Innovation
Thursday, May 14th, 2009 at 2pm EST/11am PST
http://www.frost.com/prod/servlet/ebroadcast.pag?eventid=164160027
Learn:
* Why and when to partner with an external Innovation services company
* How to leverage an external Innovation Partner to create new lines of revenue
* What qualities to look for in an Innovation Partner to meet your diverse needs
Perspectives from a Senior Emerging Technology Analyst, a successful corporate Innovator, and a leading Innovation Partner on how to move from entrenched product offerings and market applications to new growth-based innovation models.
===
P@P
Friday, April 3, 2009
Dueling dinosaurs - the Cloud edition
Dueling Dinosaurs
A few months back, while talking about setting networking technical standards, one of the networking statesmen told me Dave Clark's dueling-dinosaurs story as a metaphor for timing as a critical factor. In this version, the best time to set a standard is when the core technical requirements have been worked out but the commercial interests have not yet become deeply entrenched.

In other words, if the core requirements have not yet been worked out, the standard is liable to be of poor quality, which impedes its proliferation. On the other hand, when there is significant commercial entrenchment before a given standard is set, there is every business incentive to bias the standard, which fractures the industry.
Clash of the cloud dinosaurs
Just read The Economist article about how two interest groups are fighting over Cloud Computing standards. In short, one group, which already has a meaningful footprint in the Cloud space, is happy with things as they are. The opposing group is proposing interoperability standards that would allow users to easily switch between services.
--
It is a comfort to know that even in the ever-changing world of technology, some things don't.
===
P@P
Labels:
Cloud Computing,
networking,
Standards,
tech adoption
Monday, March 9, 2009
Relational Software's Pitch
An archeological find
I was doing some research on Oracle. Specifically, I was trying to understand how the world transitioned from hierarchical databases to relational databases. It turns out that circa 1983, when the company changed its name from its 1977 name, Relational Software, to Oracle, it was handing out its story.
This is an archeological find from the early days of commercial relational databases.
How did Relational/Oracle do it?
According to this 1983 document, the value proposition of the relational database is that it can be manipulated by non-technical users. So, instead of waiting days for database administrators and programmers to produce the information, a user can construct a query and get the data in a matter of minutes or hours.
Ultimately, however, the key economic/business driver is that it is easier for corporations to build up huge amounts of data than to hire and train database specialists. In this context, the relational database and the SQL language make the data much more valuable to the business operators.
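To see that pitch in modern terms, here is a toy ad hoc query - the table, columns, and data are all invented for illustration - using SQLite from Python:

```python
import sqlite3

# A toy illustration of the 1983 pitch: an ad hoc business question answered
# with one declarative query instead of a custom program written by specialists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("West", "widget", 120.0), ("West", "gadget", 80.0), ("East", "widget", 200.0)],
)

# "Total sales by region" - the kind of question that once meant waiting days
# for a programmer to write a report against a hierarchical database.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 200.0), ('West', 200.0)]
```

The point is the declarative `SELECT ... GROUP BY`: the user states what they want, not how to navigate the storage hierarchy to get it.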
Looking for the fundamental economic shift
With perfect hindsight, Oracle was clearly right.
At a deeper level, however, the relational database fundamentally changed the economics of databases, turning a high-end specialty tool into a common utility in almost all aspects of our digital life today.
The more relevant question, then, is: what technology is fundamentally changing the economics of how we do things today?
===
P@P
Thursday, March 5, 2009
How to catch a spy
Source: Plame vs. the White House
For those of you who do not remember, Valerie Plame was working for the CIA as an undercover agent, and the White House leaked her CIA identity in 2003. With her cover identity blown, she left the CIA in 2005.
In 2007, she published a memoir, "Fair Game: My Life as a Spy, My Betrayal by the White House". The CIA intervened and redacted (blacked out) "sensitive" information in the published book.

A page of the redacted Fair Game
How to catch a spy, the PARC way
A PARC team has developed a machine learning engine that can use contextual information that may not be sensitive by itself but, in aggregate, provides a strong inference about what the missing information should be.
The Plame book is a perfect test case because, although the book has been redacted, the actual information is available from other public sources. In other words, we can run the book through the engine, see what kind of inference it gives us, and check it against the known answers.
Test case: where was her first assignment?
So, we fed the available and seemingly innocuous descriptions of the location (redacted) of her first assignment, such as "Europe, chaotic, outdoor café, traffic, summer heat", into the software.
Lo and behold, the engine came back with Greece as the most probable answer, which was indeed the case.
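As a toy illustration of the inference idea - this is emphatically not the PARC engine, and the clue-to-location associations below are invented for the sketch - one could score candidate locations by counting how many innocuous clues each one supports:

```python
from collections import defaultdict

# Toy stand-in for aggregate inference: individually harmless clues vote for
# candidate locations; the redacted value is guessed as the top scorer.
# The associations here are made up purely for illustration.
CLUE_LOCATIONS = {
    "Europe": {"Greece", "France", "Germany"},
    "summer heat": {"Greece", "Egypt"},
    "outdoor café": {"Greece", "France"},
    "chaotic traffic": {"Greece", "Egypt"},
}

def infer(clues):
    scores = defaultdict(int)
    for clue in clues:
        for place in CLUE_LOCATIONS.get(clue, ()):
            scores[place] += 1  # each supporting clue adds one vote
    # The highest-scoring candidate is the most probable fill for the redaction.
    return max(scores, key=scores.get)

best = infer(["Europe", "summer heat", "outdoor café", "chaotic traffic"])
print(best)  # Greece supports all four clues, more than any other candidate
```

A real engine would weight clues statistically over a large corpus rather than count votes, but the principle is the same: no single clue is sensitive, while their aggregate is.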
--
How would you use this software engine beyond figuring out whether your censors are good enough? Conversely, how would you use the output of this engine? How about removing sensitive medical information from unstructured text? Or finding that smoking gun in the mountain of data and emails in a legal case? This is an instance where tireless software with perfect memory over a large corpus of information is a better solution than the best trained/paid human attention, any day.
Let me know how you would use this capability. For the most interesting idea(s), maybe I can get you a copy of the software engine to play with.
Look forward to hearing from you.
===
P@P