Marten and Open Source Hook
I was chatting with Marten Mickos recently, and the question came up of what makes an open source effort succeed while others flounder. He suggested that having a specific hook/niche that can be easily articulated is an important factor among many others. In the case of MySQL, it was a database designed specifically for web usage.
In mundane business talk, it is about having a unique value proposition.
CCN and its Open Source
CCN's open source release came out last year, so Marten's observation got me thinking: what is CCN's hook?
According to a recent Network World article, it is about security and multimedia/content consumption.
I would be interested in your take on what CCN's unique value proposition is. I am all ears.
===
P@P
Friday, January 8, 2010
Tuesday, January 5, 2010
CCN and internet at 2020
2020 Vision: Why you won't recognize the 'Net in 10 years
Here is an interesting article about various efforts currently under way to tackle existing problems with the internet. Naturally, PARC's content-centric networking (CCN) was mentioned as one of these efforts.
Content-Centric Networking
The article focused on the security implications of CCN, although I would argue that security is just one of the reasons CCN is a compelling solution. For example, by focusing on content instead of IP addresses, CCN opens up new ways of consuming digital content that are not possible today.
The best way to find out more is to watch this video presentation by Van on CCN. And for those adventurous and technically inclined enough, you may also want to try out the CCN open source release.
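To make the content-vs-IP distinction a bit more concrete, here is a toy sketch in Python. This is my own illustrative model, not the actual CCNx protocol or API: consumers express interest in data by name, and any node holding a copy can answer, caching the data along the way.

```python
# Toy sketch of the content-centric idea: a request names *what* you want,
# not *where* it lives, so any node holding the data can answer it.
# Purely illustrative; not the CCNx wire protocol.

class Node:
    def __init__(self, neighbors=None):
        self.store = {}               # content name -> data (this node's cache)
        self.neighbors = neighbors or []

    def publish(self, name, data):
        self.store[name] = data

    def express_interest(self, name):
        """Return data for `name` from the local cache or via neighbors."""
        if name in self.store:
            return self.store[name]
        for n in self.neighbors:
            data = n.express_interest(name)
            if data is not None:
                self.store[name] = data   # opportunistic caching on the way back
                return data
        return None

origin = Node()
router = Node(neighbors=[origin])
consumer = Node(neighbors=[router])

origin.publish("/parc/videos/ccn-talk", b"video bytes")
first = consumer.express_interest("/parc/videos/ccn-talk")
# The router now holds a cached copy, so a second consumer nearby would be
# served without ever touching the origin.
cached_at_router = "/parc/videos/ccn-talk" in router.store
```

Note how the consumer never names a host: the same request works whether the answer comes from the origin or from a cache in between, which is exactly the property that enables new content-consumption patterns.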
===
P@P
Monday, October 5, 2009
CCN open source release at CCNX.ORG
A new approach to networking
I have made references to the CCN project that a team at PARC is working on. I am very glad to report that the first open source release has gone live at www.ccnx.org.
PARC Newsletter announcement
PARC just released an early version of open source infrastructure software and protocol specifications for our "Content-Centric Networking (CCN)" architecture. Our goal is to enable experimentation in the network research community and establish a foundation of open core protocols for content networking. We are also beginning to work with clients to explore new business solutions enabled by the CCN approach.
===
P@P
Tuesday, September 29, 2009
Digital Distribution as a Competitive Advantage for Media Companies
The trouble with the digital world
The issues confronting many media companies in the world of the internet in general, and news aggregation in particular, are well known by now. The traditional business model of setting up infrastructure to serve a particular geographic region is no longer seen as a compelling advantage.
Physical distribution of digital content as an advantage
Just thinking out loud. At the most basic level, while the internet is considered "virtual", there are still physical/geographic limitations on the actual routing of bits and bytes. For example, a media company that provides coverage outside of homes and offices in a region could deter others from setting up parallel infrastructure. Furthermore, by linking into a regional advertising base, a media company can provide very fine-grained data for advertisers, which would be worth a lot more than general IP-based information.
In other words, could this be a form of digital distribution that is highly defensible?
===
P@P
Thursday, April 30, 2009
Can 802.11 cut the HD TV cords?
802.11/WiFi for HDTV
For those of us in the know, 802.11 is the de facto wireless standard for computing devices: it comes with virtually every wireless PC/laptop and is increasingly available in mobile handsets and gaming consoles.
So the logical question is: can 802.11 become the de facto wireless standard for all electronic devices? Say, instead of wiring up my new HDTV to all the players and other devices, would something like 802.11 help cut the cords?
This is an issue that a lot of industry players are grappling with, and the latest verdict, for whatever it is worth, is yes, according to this analyst report: "802.11n Wi-Fi Technology is the Spoiler at the Wireless HD Video Party; Will Dominate" (http://in-stat.com/press.asp?ID=2513&sku=IN0904455MI).
Reality Check
Instead of arguing over the technical details, we conducted an analysis of how 802.11n would perform in typical home scenarios. Think of it as a simulation of how 802.11n would work in a perfect environment, so your performance at home will only be worse...
Assumptions
It turns out that working out the specific variables/assumptions is the trickiest part. This is what we came up with: a home environment with different types of wireless traffic, namely some web surfing, some VoIP chatter, some TV watching, and some Blu-ray video watching.
We also looked into the distances among all the devices because interference is a very real issue with 802.11. As devices move further apart, there is less interference, and vice versa.
So, here are the variables that we tried in our simulations.
Devices:
Blu-ray video
HDTV
Media center (MC) - provides Blu-ray and HDTV streams to up to two (2) TVs
Web session
VoIP call
Access Point (AP) - supports the web and VoIP sessions
Distances:
Short - 3m/10ft
Medium - 10m/30ft
Long - 15m/50ft
Scenarios
Without being exhaustive, scenarios range from:
Most “forgiving”: 1 MC + 1 HDTV + 1 AP + 1 VoIP + 1 Web at long distance (15m/50ft)
to
Least “forgiving”: 1 MC + 1 Blu-ray + 1 Blu-ray + 1 AP + 1 VoIP + 1 Web at short distance (3m/10ft)
Results
The performance threshold we used was a delay of less than 200 ms; anything more than that becomes noticeable to human perception.
The long and short of it is that 802.11n only performs within the acceptable range in the case of one HDTV far away from other devices, i.e., the most "forgiving" scenario. On top of that, all the other users, such as web surfing and VoIP, would have to accept a significant performance hit.
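For a rough feel of the capacity check behind these scenarios, here is a back-of-envelope sketch. Every number in it is a hypothetical placeholder, not a figure from our actual simulations; the distance-to-capacity mapping just follows the interference framing above (devices packed closer together leave less usable channel capacity).

```python
# Back-of-envelope capacity check for shared-channel 802.11n scenarios.
# All numbers are illustrative assumptions, not measured results.

# Assumed effective shared-channel capacity (Mbps) after MAC overhead,
# keyed by how far apart the devices are (closer = more mutual interference).
EFFECTIVE_CAPACITY = {"short": 40.0, "medium": 60.0, "long": 80.0}

# Assumed per-stream bandwidth demands (Mbps).
STREAM_DEMAND = {"hdtv": 20.0, "bluray": 40.0, "web": 2.0, "voip": 0.1}

def total_demand(streams):
    """Sum the offered load (Mbps) of all active streams."""
    return sum(STREAM_DEMAND[s] for s in streams)

def fits(streams, distance):
    """True if the aggregate demand fits within the shared channel capacity."""
    return total_demand(streams) <= EFFECTIVE_CAPACITY[distance]

forgiving = ["hdtv", "web", "voip"]                   # most "forgiving" scenario
unforgiving = ["bluray", "bluray", "web", "voip"]     # least "forgiving" scenario

print(fits(forgiving, "long"))     # the one case that fits
print(fits(unforgiving, "short"))  # two Blu-ray streams overwhelm the channel
```

Even in this crude model, only the single-HDTV-at-distance case fits, which matches the shape of our simulation results; a real analysis would also have to account for retransmissions, rate adaptation, and queueing delay against the 200 ms threshold.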
--
I would love to hear your experiences using 802.11 in a mixed-media environment with multiple users. Maybe I am missing something here.
===
P@P
Labels:
802.11,
ad hoc networking,
CAPE,
networking,
Standards
Friday, April 3, 2009
Dueling dinosaurs - the Cloud edition
Dueling Dinosaurs
A few months back, while talking about setting networking technical standards, one of the networking statesmen told me Dave Clark's dueling dinosaurs story as a metaphor for timing as a critical factor. In this version, the best time to set a standard is when the core technical requirements have been worked out but commercial interests have not yet become deeply entrenched.

In other words, if the core requirements have not yet been worked out, the standard is liable to be of poor quality, which impedes its proliferation. On the other hand, when there is significant commercial entrenchment before a given standard is set, there is every business incentive to bias the standard, which will fracture the industry.
Clash of the cloud dinosaurs
I just read the Economist article about how two interest groups are fighting over cloud computing standards. In short, one group, which already has a meaningful footprint in the cloud space, is happy with the status quo. The opposing group is proposing interoperability standards that would allow users to easily switch between services.
--
It is a comfort to know that even in the ever-changing world of technology, some things don't change.
===
P@P
Labels:
Cloud Computing,
networking,
Standards,
tech adoption
Wednesday, February 18, 2009
Third fire alarm
Fire! Again!
What's up with this! A third accidental fire alarm in as many months. Seems like we are on a roll. Or there is a minor conspiracy that I am not aware of.
While waiting in the emergency area, I did have a chance to strike up a conversation about PARC during the 1989 Loma Prieta earthquake, a.k.a. the World Series Quake. Back then, people were using car radios to get updates on the status.
If it were to happen today, I wonder whether the cell phone towers would hold up during the quake, and whether they would be overwhelmed with calls. Maybe a distributed, ad hoc networking backbone would be a more robust solution, although I think people used WiMAX after the 2004 tsunami in the Indian Ocean.
Yeah, back to work.
==
P@P
Friday, December 19, 2008
Context Aware Protocol Engines (CAPE)
Ad Hoc Networking at PARC
As the birthplace of Ethernet, PARC has maintained a presence in networking technology over the years, and one of the areas PARC is currently working on is ad hoc networking technology.
CAPE (Context Aware Protocol Engines) came out of several lines of work related to wireless ad hoc networking. A major early driver/consumer of this type of technology is DARPA, with military applications on the battlefield, where the operating environment is highly dynamic and the ability to communicate effectively is mission-critical.
However, as with most technologies, there are many civilian commercial applications that may be worth exploring.
CAPE
Without going into the details, CAPE sets up a wireless network that is self-configuring, scalable, high-utilization, and high-quality. Specifically, here are some of the advantages we have been able to demonstrate vis-à-vis conventional solutions such as 802.11.
Higher throughput: conventional solutions carry a fair amount of traffic-management overhead, whereas CAPE was designed to minimize that overhead. In practice, this means that for a given bandwidth (pipe size), CAPE can pack a lot more traffic into it.
Better Quality of Service: looking beyond throughput, a major network issue for the end user is jitter and delay in real-time streams such as video and VoIP calls. While the payload may not be large in absolute terms, the sequence and speed of delivery are vital to smooth operation. Again, this is an area where conventional solutions often fail and where CAPE can maintain a high QoS level.
Multi-Hop: for those of you who have had to deal with multi-hop ad hoc networking issues, you know how big a challenge this is. Suffice it to say that CAPE allows for quite a few hops without requiring special antennas or extensive topological engineering.
Self-configuring: since it is not reasonable to expect a soldier to spend five minutes setting up network access in the heat of battle, CAPE allows each node to self-configure as it joins the network. In the more mundane world, where I live, this means that when mom buys a new laptop, I do not have to configure it for her!
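To illustrate the throughput point above with a toy calculation: the overhead fractions below are hypothetical, chosen only to show the shape of the argument, and are not measured CAPE or 802.11 figures.

```python
# Illustrative goodput comparison under different MAC overhead levels.
# All percentages are hypothetical placeholders, not measured results.

def goodput(link_rate_mbps, overhead_fraction):
    """Application-visible throughput after MAC/management overhead."""
    return link_rate_mbps * (1.0 - overhead_fraction)

LINK_RATE = 54.0               # nominal link rate (Mbps), 802.11a/g class
CONVENTIONAL_OVERHEAD = 0.45   # hypothetical: contention, ACKs, management traffic
LOW_OVERHEAD = 0.15            # hypothetical: a CAPE-style minimized MAC

conv = goodput(LINK_RATE, CONVENTIONAL_OVERHEAD)   # 29.7 Mbps
lean = goodput(LINK_RATE, LOW_OVERHEAD)            # 45.9 Mbps
print(f"conventional: {conv:.1f} Mbps, low-overhead: {lean:.1f} Mbps")
```

The point is simply that for the same pipe size, cutting the protocol overhead translates directly into more room for actual traffic.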
What can you do with CAPE?
I am looking into application areas for CAPE. I would love to hear your ideas and suggestions.
===
P@P
Labels:
802.11,
ad hoc networking,
CAPE,
networking