
Saturday, July 5, 2014

Too Big To Know: David Weinberger redefines the meaning and power of knowledge - extended reflections ( Too Big To Know, Unsettling Knowledge )

    Knowledge has become "networked." In this book, Internet-thought pioneer Weinberger explains clearly how networked knowledge shapes our thinking and decision-making in business, science, education, and government. When knowledge becomes networked - abundant, open, and interconnected - our existing views of what knowledge is and how it works come under serious challenge. How to meet that challenge and build a new architecture for knowledge is the key task for knowledge workers in the Internet age.

    Author David Weinberger holds a Ph.D. in philosophy, is a commentator on information technology, and is a leading pioneer of Internet thought. A senior researcher at Harvard's Berkman Center for Internet & Society, he observes closely, and with original insight, how the Internet affects society, relationships, and communication in the digital age. He has served as an Internet policy adviser to a U.S. presidential candidate, comments regularly on American public radio, and has written countless articles on information, the Internet, and marketing for publications including Wired, USA Today, Smithsonian, Harvard Business Review, Scientific American, and The New York Times. He has also been a marketing consultant to many high-tech companies; The Wall Street Journal even dubbed him a "marketing guru."

   In an accessible way, this book makes a very important point: knowledge is no longer confined to, and preserved in, the form of printed books; the network and its links are knowledge's inexhaustible temple. Reading this book may not make you smarter, but you will certainly learn how to make the Internet you use every day smarter.

The rise of the Internet, and especially of social networking sites in recent years, has let messages of dubious authenticity spread everywhere, forcing us to ask whether the traditional definition of knowledge still holds. Fortunately, many people online are trying, in various ways, to help users filter information, and these experiments are turning into one innovation and business opportunity after another. That is what makes working in the Internet industry so exciting. Almost everyone knows that knowledge is power, yet few pay attention to why that power is being redistributed and with what consequences. Every revolution begins there.

When knowledge is no longer confined to the form of the book, the collective experience and thought of whole populations can be integrated and recombined through the network. Knowledge is no longer the exclusive property of experts; smart, networked collective decision-making emerges and profoundly affects daily life, democracy, human rights, education, and technology. Francis Bacon's "knowledge is power" evolves into "knowledge, community trends, and appeal are overwhelming power" - a power far greater than we imagine. Apple's iPhone combined technical knowledge, the iTunes community trend, and the appeal of refined hardware and interface design to knock the long-reigning handset king Nokia down beyond recovery; Google's Android used similar technical knowledge, the Google community and open-source trend, and equally appealing design to push touch-screen smartphones to an 80% market share. With the Internet and social networks, "knowledge is power" itself is being transformed.



Knowledge digitized and networked: digital, networked knowledge has defeated the traditional book and become the mainstream

  In the era before knowledge was digitized, the encyclopedia business was basically a distribution business: most of the cost was the salesmen's commissions. After the Internet arrived, new technologies made the distribution of knowledge orders of magnitude cheaper, and the encyclopedia industry collapsed. Encyclopaedia Britannica was defeated by Wikipedia.

   Wikipedia went online on January 13, 2001 and formally launched on January 15, 2001; it is maintained by the Wikimedia Foundation. As of June 2012, the English Wikipedia, the largest edition, had 3.6 million articles; the 282 independently run language editions together exceeded 19 million articles; registered users exceeded 29.6 million; and total edits exceeded one billion. According to the well-known Alexa traffic rankings, Wikipedia was the sixth most-visited website in the world (fifth at its peak) and the largest ad-free site. Most pages can be read and edited by anyone with a browser, and Wikipedia's popularity has spawned sister projects such as Wikinews and Wikibooks. Although the wiki model raises disputes about the accuracy of content that anyone can edit, material whose cited sources can be verified, reviewed, and confirmed still earns a degree of recognition.

BCG technology and strategy trend analysis: the encyclopedia business was basically a distribution business, with most of the cost going to salesmen's commissions. After the CD-ROM came the Internet; new technologies made the distribution of knowledge orders of magnitude cheaper, and the encyclopedia industry collapsed. By now this is a familiar story; more generally, it is the story of the first generation of the Internet economy: falling transaction costs broke up value chains and allowed disintermediation, or what we call deconstruction of the channel. The obvious example: what replaced Encyclopaedia Britannica once it no longer had a business model? The answer became obvious - Wikipedia. What is special about Wikipedia is not its distribution but its mode of production: Wikipedia is an encyclopedia created by its users.

Networked knowledge and expertise in the cloud will drive the formation of cloud knowledge and virtual cloud experts

Led by the Internet, knowledge is now becoming social, mobile, and public, and Weinberger shows us how to decode the benefits. In an age of "information overload," the filtering strategy is no longer to delete, but to filter the information we need to the front.

Once knowledge is social, mobile, and public, humanity generates as much as 2.5 billion GB of data every day - it would take 40 million 64 GB iPads to hold it all. In the past, the data volume was too large, too much of it was worthless, and the "gamble" was too expensive for any company to filter it all with existing processes; new methods and tools, however, can mine the treasures worth refining and keeping. Integrated and analyzed, this sea of data can help a company add 50% more new customers, or help a government cut costs by 30%.
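The iPad comparison above is easy to verify; a quick sketch using only the figures quoted in the paragraph (2.5 billion GB per day, 64 GB per device):

```python
# Daily data produced worldwide, per the figure cited above.
daily_data_gb = 2.5e9        # 2.5 billion GB per day
ipad_capacity_gb = 64        # capacity of one 64 GB iPad

# How many iPads would one day's data fill?
ipads_needed = daily_data_gb / ipad_capacity_gb
print(f"{ipads_needed:,.0f} iPads")  # ~39 million, i.e. roughly 40 million
```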

These tidal flows of data produced by networked knowledge, once analyzed at scale, are the new gold vein of the cloud era. They have already created striking returns, and they will drive the formation of cloud knowledge and virtual cloud expert systems; combined with artificial intelligence, such cloud expert systems will create entirely new applications.

Networked knowledge and digitized books will create new e-book cloud knowledge and digest-and-summary services

Led by the Internet, the lesson of the Google Ngram Viewer is to ask whether the digitization and networking of books will lead to new cloud knowledge and commercial summarization services: for example, reading 100 stock-market e-books to compile aggregate candlestick (K-line) statistics, or reading 30 energy-related e-books to produce a digest. This way of reading, which accelerates access to the distilled essence of many experts, will change how people absorb knowledge, derive statistics from it, and define it.


 

Monday, June 2, 2014

Fierce competition among global cloud platforms ( the worldwide cloud platform market has become a very competitive business )

Google Cloud Platform arrives in Taiwan to compete with Amazon and Microsoft

As Internet applications mature, cloud services have become an important lever for corporate competitiveness. Targeting the needs of Taiwanese enterprise customers, Google has launched Google Cloud Platform in Taiwan for the first time, fully upgrading its enterprise services to give customers the fastest, most stable, highest-quality cloud backbone - and competing in the Taiwan market against Amazon Web Services (AWS), Microsoft Windows Azure, and others.

‧Google Cloud Platform: https://cloud.google.com/

Google Cloud Platform spans three layers - compute, storage, and application services - and Taiwan is its first stop in Asia. The first wave of services includes Google Compute Engine, Google Cloud Storage, and Google Cloud SQL, giving enterprise customers and developers the same global cloud infrastructure with higher compute performance and lower latency. It also provides a range of tools and application programming interfaces (APIs) for building applications on a cloud of unmatched scale and speed, drawing on the collaboration and computation of Google's data centers in the Changbin Industrial Zone, Singapore, and elsewhere. Services such as Google App Engine, Google BigQuery, and Google Cloud Datastore will come online in Asia later.

Eight months ago, Google Taiwan began promoting Google Enterprise services such as corporate Gmail, Docs, Drive, and other familiar Google Apps, along with the Marketplace and the developer platform. Going forward, enterprises can focus on building applications and leave the hardware and cloud architecture to Google Cloud Platform. Today 4.75 million applications run on the platform worldwide; App Engine handles up to 28 billion requests per day, and Cloud Datastore performs 6.3 trillion storage operations per month.

Well-known adopters abroad include the photo-messaging service Snapchat, "Angry Birds" maker Rovio, and Japanese social-game developer Applibot. In Taiwan, the Taiwan Geographic Information Center (TGIC) uses Google Earth Enterprise to improve its disaster-response and land-planning systems, and the Directorate General of Highways also uses Google Earth for early disaster-preparedness planning.

Google Taiwan managing director Lee-Feng Chien, joined by visiting director of cloud platform operations Daniel Powers, laid out the platform's direction and vision for Taiwan. Powers said that Taiwanese enterprise users and developers alike can now enjoy the speed, elasticity, and scale of the global cloud infrastructure Google has built over 15 years, along with localized tools and technical support. Google will also hold a series of cloud seminars in Taipei, Hong Kong, and other Asian cities to show developers how Google Cloud Platform can improve application development efficiency.

Google cuts cloud prices by up to 85%

The cloud price war is on. The New York Times reports that Google's cloud business is on the offensive, using price cuts of up to 85% plus software-update services to challenge Amazon.com's dominance. Google's cloud services will be cut by 30% to 85%: cloud storage drops to US$0.026 per GB, about 68% below the previous price; Compute Engine becomes 32% cheaper regardless of region, size, or class; and BigQuery data-analysis prices fall by about 85%. Google hopes eventually to turn its cloud business into a product that integrates applications and data, rather than a collection of scattered features.
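As a sanity check on the storage figure, if US$0.026 per GB is about 68% below the old price, the implied old price is roughly US$0.08 per GB - a back-of-envelope sketch, not a figure from the article:

```python
new_price = 0.026            # USD per GB after the cut
cut = 0.68                   # "about 68% lower" than before
old_price = new_price / (1 - cut)
print(f"implied old price: ${old_price:.3f}/GB")  # ≈ $0.081/GB
```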

Strategically, Google's deep cuts to cloud prices are a smart move. Amazon pioneered the cloud-computing business, renting out compute power and storage to other companies starting in 2006. Famous brands such as Netflix and Shell run their businesses on Amazon's platform, but at a high price; many smaller Internet firms could not swallow the cost and had to turn to other cloud systems.

Besides competing with Amazon, Microsoft, and Rackspace Hosting Inc., Google must also fight new providers for customers. Research firm Gartner estimates that the cloud-services market reached US$131 billion last year.

Urs Hölzle, Google's senior vice president of technical infrastructure, said on the 25th that while the cost of cloud servers and storage systems has fallen sharply, cloud-computing prices have not kept falling with it, and Google sees no reason for a large gap between the two. Compute Engine prices will drop about 32%, and the App Engine application-development platform about 30%.

In addition, Google announced on the 13th that Google Drive rates would also drop sharply: the monthly fee for 100 GB of cloud storage falls 40%, from US$4.99 to US$2.99, and the 1 TB plan falls 80%, from US$49.99 to US$9.99; 10 TB costs US$99.99 a month, and users can pay more for additional space.

Microsoft spends big: OneDrive offers 15 GB free and slashes paid-tier prices

To keep expanding its share of the consumer cloud, Microsoft announced yesterday that starting next month OneDrive's free quota will rise from 7 GB to 15 GB; combined with its existing cross-platform clients, this lets users keep more of their data in the cloud.

Microsoft is also cutting OneDrive's paid tiers: 100 GB drops from US$7.49 to US$1.99, and 200 GB from US$11.40 to US$3.99. Office 365 subscribers will each receive 1 TB of online storage.



Saturday, March 15, 2014

Technology and strategy trends: Big Data and the relationships within data will transform the future of business ( Big Data Will Transform The Future Of Business - Genome, Big Data with NORA, and knowledge formed by data analysis will become a new business in the future )

Philip Evans is a senior partner and managing director in BCG's Boston office. He founded BCG's media sector and has consulted for corporations worldwide in the financial services, consumer goods, media, and high-technology industries. He has also advised governments on military strategy, homeland security, and national economic policy.

Philip's long-term interest is the relation between information technology and business strategy. He is a coauthor of, among other publications, four Harvard Business Review articles, one of which, "Strategy and the New Economics of Information," won HBR's McKinsey Award. Blown to Bits (coauthored with Tom Wurster) was the best-selling book worldwide on technology and strategy in 2000 and has been translated into 13 languages. He is a frequent speaker on technology and strategy at industry, corporate, and academic conferences and has given keynote addresses at events convened by Bill Gates, Michael Milken, and the World Economic Forum.

I'm going to talk a little bit about strategy and its relationship with technology. We tend to think of business strategy as being a rather abstract body of essentially economic thought, perhaps rather timeless. I'm going to argue that, in fact, business strategy has always been premised on assumptions about technology, that those assumptions are changing, and, in fact, changing quite dramatically, and that therefore what that will drive us to is a different concept of what we mean by business strategy.

Let me start, if I may, with a little bit of history. The idea of strategy in business owes its origins to two intellectual giants: Bruce Henderson, the founder of BCG, and Michael Porter, professor at the Harvard Business School. Henderson's central idea was what you might call the Napoleonic idea of concentrating mass against weakness, of overwhelming the enemy. What Henderson recognized was that, in the business world, there are many phenomena which are characterized by what economists would call increasing returns -- scale, experience. The more you do of something, disproportionately the better you get. And therefore he found a logic for investing in such kinds of overwhelming mass in order to achieve competitive advantage. And that was the first introduction of essentially a military concept of strategy into the business world. ( Note: Bruce Henderson's reference on business strategy and Michael Porter's value chain )

Porter agreed with that premise, but he qualified it. He pointed out, correctly, that that's all very well, but businesses actually have multiple steps to them. They have different components, and each of those components might be driven by a different kind of strategy. A company or a business might actually be advantaged in some activities but disadvantaged in others. He formed the concept of the value chain, essentially the sequence of steps with which a, shall we say, raw material, becomes a component, becomes assembled into a finished product, and then is distributed, for example, and he argued that advantage accrued to each of those components, and that the advantage of the whole was in some sense the sum or the average of that of its parts. And this idea of the value chain was predicated on the recognition that what holds a business together is transaction costs, that in essence you need to coordinate, organizations are more efficient at coordination than markets, very often, and therefore the nature and role and boundaries of the cooperation are defined by transaction costs. It was on those two ideas, Henderson's idea of increasing returns to scale and experience, and Porter's idea of the value chain, encompassing heterogenous elements, that the whole edifice of business strategy was subsequently erected.

Now what I'm going to argue is that those premises are, in fact, being invalidated. First of all, let's think about transaction costs. There are really two components to transaction costs. One is about processing information, and the other is about communication. These are the economics of processing and communicating as they have evolved over a long period of time. As we all know from so many contexts, they have been radically transformed since the days when Porter and Henderson first formulated their theories. In particular, since the mid-'90s, communications costs have actually been falling even faster than transaction costs, which is why communication, the Internet, has exploded in such a dramatic fashion. Now, those falling transaction costs have profound consequences, because if transaction costs are the glue that hold value chains together, and they are falling, there is less to economize on. There is less need for vertically integrated organization, and value chains at least can break up. They needn't necessarily, but they can. In particular, it then becomes possible for a competitor in one business to use their position in one step of the value chain in order to penetrate or attack or disintermediate the competitor in another.

That is not just an abstract proposition. There are many very specific stories of how that actually happened. A poster child example was the encyclopedia business. The encyclopedia business in the days of leatherbound books was basically a distribution business. Most of the cost was the commission to the salesmen. The CD-ROM and then the Internet came along, new technologies made the distribution of knowledge many orders of magnitude cheaper, and the encyclopedia industry collapsed. It's now, of course, a very familiar story. This, in fact, more generally was the story of the first generation of the Internet economy. It was about falling transaction costs breaking up value chains and therefore allowing disintermediation, or what we call deconstruction.

One of the questions I was occasionally asked was, well, what's going to replace the encyclopedia when Britannica no longer has a business model? And it was a while before the answer became manifest. Now, of course, we know what it is: it's the Wikipedia. Now what's special about the Wikipedia is not its distribution. What's special about the Wikipedia is the way it's produced. The Wikipedia, of course, is an encyclopedia created by its users. And this, in fact, defines what you might call the second decade of the Internet economy, the decade in which the Internet as a noun became the Internet as a verb. It became a set of conversations, the era in which user-generated content and social networks became the dominant phenomenon. Now what that really meant in terms of the Porter-Henderson framework was the collapse of certain kinds of economies of scale. It turned out that tens of thousands of autonomous individuals writing an encyclopedia could do just as good a job, and certainly a much cheaper job, than professionals in a hierarchical organization. So basically what was happening was that one layer of this value chain was becoming fragmented, as individuals could take over where organizations were no longer needed.

But there's another question that obviously this graph poses, which is, okay, we've gone through two decades -- does anything distinguish the third? And what I'm going to argue is that indeed something does distinguish the third, and it maps exactly on to the kind of Porter-Henderson logic that we've been talking about. And that is, about data. If we go back to around 2000, a lot of people were talking about the information revolution, and it was indeed true that the world's stock of data was growing, indeed growing quite fast, but it was still at that point overwhelmingly analog. We go forward to 2007, not only had the world's stock of data exploded, but there'd been this massive substitution of digital for analog. And more important even than that, if you look more carefully at this graph, what you will observe is that about a half of that digital data is information that has an I.P. address. It's on a server or it's on a P.C. But having an I.P. address means that it can be connected to any other data that has an I.P. address. It means it becomes possible to put together half of the world's knowledge in order to see patterns, an entirely new thing. If we run the numbers forward to today, it probably looks something like this. We're not really sure. If we run the numbers forward to 2020, we of course have an exact number, courtesy of IDC. It's curious that the future is so much more predictable than the present. And what it implies is a hundredfold multiplication in the stock of information that is connected via an I.P. address. Now, if the number of connections that we can make is proportional to the number of pairs of data points, a hundredfold multiplication in the quantity of data is a ten-thousandfold multiplication in the number of patterns that we can see in that data, this just in the last 10 or 11 years. This, I would submit, is a sea change, a profound change in the economics of the world that we live in.
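Evans's hundredfold-to-ten-thousandfold claim follows from counting pairs: the number of possible pairings of n data points is n(n-1)/2, which grows roughly as n², so multiplying the data by 100 multiplies the pairs by about 100² = 10,000. A minimal check:

```python
def pairs(n: int) -> int:
    """Number of distinct pairs among n data points: C(n, 2) = n(n-1)/2."""
    return n * (n - 1) // 2

# A hundredfold increase in data points...
n = 1_000_000
ratio = pairs(100 * n) / pairs(n)
print(round(ratio))  # ...is about a ten-thousandfold increase in pairs
```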

Now, what does that imply in terms of business? Well, I got a hint of this some years ago. Back in around 2003 or so, I was doing some consulting for the Pentagon, of all august institutions, on the subject of network-centric warfare, and in that context I met a gentleman called Jeff Jonas, a brilliant engineer who had made his fortune designing the security systems in Las Vegas. Jeff said to me, "Next time you're in Las Vegas, Philip, why don't you stop by and I'll take you on the tour. You can meet NORA. NORA will show you a good time." NORA was not his girlfriend. NORA is the Non-Obvious Relational Awareness system, a real-time fraud control system developed by Jeff, which supports all of the casinos in Las Vegas. We were in the security room of the Bellagio Hotel in Las Vegas, and on the monitor I saw this happen. A woman was playing blackjack against the dealer. There was nobody else at the table. She was winning too much. They know how likely that is, and this wasn't likely. So the first thing they do is they use facial recognition, see if she's staying at the hotel. She wasn't. Then they can kind of run the cameras backwards, tracing her movements back through the hotel to the parking garage, where they found her car. They could then run NORA to find who owned the car. The car was owned by Hertz Las Vegas. Within a second or so, NORA pulled down the Hertz Las Vegas application. Now they knew who the woman was. Where was she staying? Well, they pool the data across the hotels. It turned out she was staying in a hotel across the street. Had she gambled in that hotel? No. Very strange behavior, staying in one hotel, gambling in another. Then came the really interesting thing. NORA looked for a connection between the woman and the dealer, because a very high fraction of fraud in Las Vegas is committed when the staff are actually in illicit collaboration with customers. 
It turned out, what NORA did was to look through 6,000 databases, public and private, some owned by the Bellagio, some by other hotels, some police records, and so on. It turned out that 10 years earlier, this woman's brother had been the dealer's roommate. And it took NORA six seconds to work that fact out. It cost the woman and the dealer six years. This was NORA in action. It's what today of course we would call big data, long before the term had been formulated.

Now notice some very interesting things about this, most of all the fact that NORA runs as a cooperative across the entire of the strip. These casinos, which are otherwise competing aggressively with each other actually collaborate when it comes to the management of their security systems. They pool data into a common database that is run essentially as a co-op for this specific purpose. Why? Because the scale of NORA, what NORA is trying to do, blows past the scale that even a very large casino can possibly do for itself. The value chain is not big enough to accommodate the economies of scale that are inherent in this particular activity. And that principle, I would suggest, is actually a fundamental and pervasive one. In essence, what happens is that because of these colossal economies of scale in data, what used to be value chains that ran separately are compelled, in order to achieve those economies of scale, to create some kind of common utility, some common resource, a co-op, a pool, a vault of data within which those insights can be gathered.

Now, NORA is a relatively trivial example in the sense that if NORA failed, it wouldn't exactly be the end of civilization. But consider something vastly more important, where the logic in fact is exactly the same, the logic of healthcare. The first human genome, that of James Watson, was mapped as the culmination of the Human Genome Project in the year 2000, and it took about 200 million dollars and about 10 years of work to map just one person's genomic makeup. Since then, the costs of mapping the genome have come down. In fact, they've come down in recent years very dramatically indeed, to the point where the cost is now below 1,000 dollars, and it's confidently predicted that by the year 2015 it will be below 100 dollars -- a five or six order of magnitude drop in the cost of genomic mapping in just a 15-year period, an extraordinary phenomenon. Now, in the days when mapping a genome cost millions, or even tens of thousands, it was basically a research enterprise. Scientists would gather some representative people, and they would see patterns, and they would try and make generalizations about human nature and disease from the abstract patterns they find from these particular selected individuals. But when the genome can be mapped for 100 bucks, 99 dollars while you wait, then what happens is, it becomes retail. It becomes above all clinical. You go the doctor with a cold, and if he or she hasn't done it already, the first thing they do is map your genome, at which point what they're now doing is not starting from some abstract knowledge of genomic medicine and trying to work out how it applies to you, but they're starting from your particular genome.
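The "five or six order of magnitude" drop is straightforward to check against the figures in the paragraph (roughly US$200 million for the first genome down to a projected US$100):

```python
import math

cost_2000 = 200_000_000   # Human Genome Project era: ~US$200M per genome
cost_projected = 100      # projected price by 2015, per the talk
orders = math.log10(cost_2000 / cost_projected)
print(f"{orders:.1f} orders of magnitude")  # ≈ 6.3, i.e. "five or six"
```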

Now think of the power of that. Think of where that takes us when we can combine genomic data with clinical data with data about drug interactions with the kind of ambient data that devices like our phone and medical sensors will increasingly be collecting. Think what happens when we collect all of that data and we can put it together and use precisely the NORA-type techniques in order to find patterns we wouldn't see before. This, I would suggest, perhaps it will take a while, but this will drive a revolution in medicine. Fabulous, lots of people talk about this.

But there's one thing that doesn't get much attention. How is that model of colossal sharing across all of those kinds of databases compatible with the business models of institutions and organizations and corporations that are involved in this business today? If your business is based on proprietary data, if your competitive advantage is defined by your data, how on Earth is that company or is that society in fact going to achieve the value that's implicit in the technology? They can't.

So essentially what's happening here, and genomics is merely one example of this, is that technology is driving the natural scaling of the activity beyond the institutional boundaries within which we have been used to thinking about it, and in particular beyond the institutional boundaries in terms of which business strategy as a discipline is formulated. The basic story here is that what used to be vertically integrated, oligopolistic competition among essentially similar kinds of competitors is evolving, by one means or another, from a vertical structure to a horizontal one. Why is that happening? It's happening because transaction costs are plummeting and because scale is polarizing. The plummeting of transaction costs weakens the glue that holds value chains together, and allows them to separate. The polarization of scale economies towards the very small -- small is beautiful -- allows for scalable communities to substitute for conventional corporate production. The scaling in the opposite direction, towards things like big data, drive the structure of business towards the creation of new kinds of institutions that can achieve that scale. But either way, the typically vertical structure gets driven to becoming more horizontal.

The logic isn't just about big data. If we were to look, for example, at the telecommunications industry, you can tell the same story about fiber optics. If we look at the pharmaceutical industry, or, for that matter, university research, you can say exactly the same story about so-called "big science." And in the opposite direction, if we look, say, at the energy sector, where all the talk is about how households will be efficient producers of green energy and efficient conservers of energy, that is, in fact, the reverse phenomenon. That is the fragmentation of scale because the very small can substitute for the traditional corporate scale.

Either way, what we are driven to is this horizontalization of the structure of industries, and that implies fundamental changes in how we think about strategy. It means, for example, that we need to think about strategy as the curation of these kinds of horizontal structure, where things like business definition and even industry definition are actually the outcomes of strategy, not something that the strategy presupposes. It means, for example, we need to work out how to accommodate collaboration and competition simultaneously. Think about the genome. Think about NORA. We need to accommodate the very large and the very small simultaneously. And we need industry structures that will accommodate very, very different motivations, from the amateur motivations of people in communities to maybe the social motivations of infrastructure built by governments, or, for that matter, cooperative institutions built by companies that are otherwise competing, because that is the only way that they can get to scale.

These kinds of transformations render the traditional premises of business strategy obsolete. They drive us into a completely new world. They require us, whether we are in the public sector or the private sector, to think very fundamentally differently about the structure of business, and, at last, it makes strategy interesting again.





Thursday, March 6, 2014

The Big Data era has arrived: large-scale data analysis becomes a new industry ( the Big Data trend is here; data analysis is becoming a profitable new business and industry )

Cisco: global mobile data traffic will surge to 190 EB by 2018


"Mobile" once covered only phones and tablets; today it spans all kinds of wearables, such as smartwatches, fitness trackers, and Google Glass. Cisco Systems expects these new devices to drive demand for data transfer, projecting that by 2018 global mobile data traffic will surge to 190 exabytes (1 exabyte equals one billion gigabytes).

CNBC and Re/code report that in its annual forecast on the 5th, Cisco said 190 exabytes of mobile traffic in 2018 would be an 11-fold jump over 2013 - equivalent to 42 trillion images or 4 trillion video clips, and 190 times the combined fixed-line and mobile traffic of 2000.
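For scale, the 190 EB figure converts as follows under the article's own definition (1 EB = one billion GB); the 2013 estimate below reads the "11-fold jump" as ending at 12× the base, which is an interpretive assumption, not a figure from the article:

```python
eb_2018 = 190
gb_per_eb = 1e9                  # the article's definition: 1 EB = 1 billion GB
total_gb = eb_2018 * gb_per_eb
print(f"{total_gb:,.0f} GB")     # 190 billion GB

# Implied 2013 traffic, assuming an 11-fold increase means 12x the base.
eb_2013 = eb_2018 / 12
print(f"≈ {eb_2013:.1f} EB in 2013")
```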

With mobile data use growing rapidly, the number of mobile devices consuming it will also jump from 7 billion in 2013 to 10 billion in 2018. UN statistics put the 2018 world population at only 7.6 billion, meaning mobile devices will then outnumber people by about 1.4 to 1.

This explosive growth in mobile traffic owes much to the spread of wearable technology. Cisco groups smartwatches, Google Glass, and other wearables under "machine-to-machine connections" (largely the Internet of Things), a category that accounted for only 1% of global mobile traffic in 2013 but is expected to reach 6% by 2018. Cisco also projects that the number of wearables in active use will leap from under 22 million in 2013 to nearly 170 million in 2018.

Striking gold in Big Data: analytics vendor Tableau beats estimates and its shares jump 13%
Tableau provides big data analysis and visual display

As Big Data sweeps in, data-analysis software is catching on. Riding the wave, Tableau Software Inc. looks set for a strong quarter, and the news sent its shares up nearly 13% on the 5th.

Data-analysis and visualization vendor Tableau reported Q4 2013 (October-December) results after the U.S. market close on the 4th: revenue jumped 95% year over year to US$81.5 million; license revenue rose 93% to US$58 million; and non-GAAP diluted EPS reached US$0.20. According to Thomson Reuters I/B/E/S, analysts had expected Q4 revenue of US$67.1 million and break-even operating EPS.

Tableau said that during Q4 it expanded its partnership with Amazon Web Services (AWS), supporting Tableau Server running on the AWS platform and adding a new data connector for the Amazon Redshift cloud data warehouse.

It also closed 179 deals worth more than US$100,000 each during the quarter and added more than 1,800 customer accounts.

Tableau CEO and president Christian Chabot said in the press release that the market opportunity keeps growing and that the company hopes to help more customers gain data-driven insight into their operations in 2014.

Thomson Reuters and Barron's report that on its earnings call Tableau forecast revenue of US$61-63 million for the current quarter (January-March), versus the US$60.3 million analysts had expected per Thomson Reuters I/B/E/S. The stock jumped 12.82% on the 5th to close at US$89.61, its highest close since listing on May 17, 2013, for a cumulative gain of 76.6% since the IPO.

Tableau is not the only beneficiary of hot Big Data demand: disk-array maker Dot Hill Systems Corp. raised its guidance for the previous quarter on January 9, sending its shares up more than 25% intraday.

Dot Hill Systems CEO Dana Kammersgard said in the statement at the time that the contracts signed with heavyweight customers across vertical markets during 2013 would be the main foundation for continued growth in 2014 in big data, oil and gas, and media and entertainment.

Big-data framework Hadoop grows 55% a year, to its founder's surprise

Hadoop, the open framework for storing, processing, and analyzing Big Data, has become the industry's hottest keyword; research puts the market's annual growth at 55%, so fast that even founder Doug Cutting was taken aback.

British business-technology news site Computing reported on the 7th that, according to research firm Transparency Market Research, the global Hadoop market is growing at nearly 55% a year and is expected to reach US$20.9 billion by 2018. Cutting said in an interview that he had no idea Hadoop would strike such a chord - even Oracle, IBM, and Microsoft had not yet realized how much demand there would be for managing and storing massive data.
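For readers who have not met Hadoop, its programming model splits work into a map phase and a reduce phase. The canonical example (not from the article) is word count; a minimal local sketch of the two phases, which Hadoop would normally run distributed across a cluster with sorted, shuffled intermediate pairs:

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit a (word, 1) pair for every word seen."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum the counts per word.
    Hadoop delivers pairs grouped by key; sorting simulates that here."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["big data big analysis", "data analysis"])))
print(counts)  # {'analysis': 2, 'big': 2, 'data': 2}
```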

Cutting noted that open source is the key to Hadoop's survival and continued strength: people dislike depending too heavily on a platform controlled by a single vendor. That approach is not the strong suit of the traditional software makers above, whereas Yahoo!, Google Inc., and university laboratories have been able to win an advantage from it.

Today Hadoop is the core technology enterprises worldwide use to store, process, and analyze massive data, and it is still evolving. Cutting left Yahoo in 2009 to become chief architect at Hadoop vendor Cloudera; 60% of Fortune 500 production projects now use Cloudera's technology. Cutting said most of Cloudera's growth currently comes from existing customers, and replacement demand should outpace the business brought in by new customers.

IDC: Taiwan's big data market to grow at a 40% CAGR over 2012-16
Worldwide big data revenue by service segment

IDC's APEJ Big Data Technology and Services 2012-2016 Forecast and Analysis projects strong growth for the Asia-Pacific (excluding Japan) big data market, from US$300 million in 2012 to US$1.76 billion in 2016, a compound annual growth rate (CAGR) of 47%. Taiwan's big data market is expected to grow from US$11.3 million in 2012 to US$46.1 million in 2016, a CAGR of 40%.
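The Taiwan figure can be rechecked with the standard CAGR formula; assuming four compounding periods between 2012 and 2016, the computed rate comes out near the roughly 40% IDC reports:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate over `years` compounding periods."""
    return (end / start) ** (1 / years) - 1

# Taiwan market per the IDC figures above: US$11.3M (2012) -> US$46.1M (2016)
taiwan = cagr(11.3, 46.1, 4)
print(f"{taiwan:.1%}")  # ≈ 42%, consistent with the ~40% reported
```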

Craig Stires, IDC's Asia-Pacific research director for big data and analytics, notes that with big-data analytics still evolving and market needs diverse, vendors keep launching new solutions while actively investing in and training partners so they can offer enterprise customers fuller consulting services. IDC observes that demand for data analytics is currently strongest in finance, telecom, government, retail, manufacturing, and energy.

Demand for big-data analytics is also rising in Taiwan. IDC Taiwan market analyst Yi-Hsiu Tsai says: "Taiwanese customers in finance, manufacturing, telecom, and retail are experimenting with, or planning to use, big-data analytics to optimize operating performance. As more and more enterprise customers show interest, more local vendors and consultants will enter the market, gradually completing Taiwan's big data ecosystem."
