Valleywag – valleywag.wordpress.com

Archive for January 6th, 2009

My guess is this has nothing to do with the phishing attacks that started on Twitter a couple of days ago. But a few minutes ago the official Fox News Twitter account posted “Breaking: Bill O Riley is gay” [sic], referring to Bill O’Reilly, host of the popular Fox show The O’Reilly Factor, right after a legitimate message about making turkey lettuce wraps.

My guess is they’re just finding out about it now, and realizing their password, which was probably “password,” has been changed. Twitter will promptly restore the account to its rightful owners, I’m sure. But here’s my question – if you’ve had your Twitter account hacked, how long did it take you to get it back?

Update: OK, this is turning into a coordinated attack or one heck of a coincidence. The official Britney Spears Twitter account (which launched in October) also appears to have been hacked. Any others?

Update 2: We’ve got another winner: Rick Sanchez from CNN, who’s apparently not going to make it to work today because he’s “high on crack right now.”

Update 3: Et tu, Facebook? (see comments for bonus hack on Obama’s account)

Update 4: Next! Huffington Post goes down too.


Update 5: 33 accounts were hacked after Twitter’s internal admin tools were compromised.



At Google we are fanatical about organizing the world’s information. As a result, we spend a lot of time finding better ways to sort information using MapReduce, a key component of our software infrastructure that allows us to run multiple processes simultaneously. MapReduce is a perfect solution for many of the computations we run daily, due in large part to its simplicity, applicability to a wide range of real-world computing tasks, and natural translation to highly scalable distributed implementations that harness the power of thousands of computers.
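
For readers who haven’t seen the model up close, here is a minimal single-machine sketch of the map/shuffle/reduce pattern in Python. It illustrates the programming model only (the word-count example and function names are ours); Google’s MapReduce runs the same three phases distributed across thousands of machines.

```python
# Minimal single-process sketch of the map/shuffle/reduce pattern.
from itertools import groupby
from operator import itemgetter

def map_phase(records, map_fn):
    # Apply the user's map function to every input record,
    # yielding intermediate (key, value) pairs.
    for record in records:
        yield from map_fn(record)

def shuffle_phase(pairs):
    # Group intermediate pairs by key; on a real cluster this is the
    # distributed sort-and-partition step between mappers and reducers.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield key, [value for _, value in group]

def reduce_phase(grouped, reduce_fn):
    # Apply the user's reduce function to each key's list of values.
    for key, values in grouped:
        yield key, reduce_fn(key, values)

# Example: word count, the canonical MapReduce demo.
docs = ["the quick brown fox", "the lazy dog"]
pairs = map_phase(docs, lambda doc: ((word, 1) for word in doc.split()))
counts = reduce_phase(shuffle_phase(pairs), lambda word, ones: sum(ones))
print(dict(counts))  # {'brown': 1, 'dog': 1, 'fox': 1, ..., 'the': 2}
```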

In our sorting experiments we have followed the rules of a standard terabyte (TB) sort benchmark. Standardized experiments help us understand and compare the benefits of various technologies and also add a competitive spirit. You can think of it as an Olympic event for computations. By pushing the boundaries of these types of programs, we learn about the limitations of current technologies as well as the lessons useful in designing next generation computing platforms. This, in turn, should help everyone have faster access to higher-quality information.

We are excited to announce we were able to sort 1TB (stored on the Google File System as 10 billion 100-byte records in uncompressed text files) on 1,000 computers in 68 seconds. By comparison, the previous 1TB sorting record is 209 seconds on 910 computers.
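
Some quick back-of-the-envelope arithmetic (ours, derived from the figures above) puts the jump in per-machine terms:

```python
# Rough throughput comparison using the numbers quoted above.
TB = 10**12  # one terabyte in bytes

google_total = TB / 68      # 1,000 machines, 68 seconds
previous_total = TB / 209   # 910 machines, 209 seconds

print(f"Google:   {google_total / 1e9:.1f} GB/s total, "
      f"{google_total / 1000 / 1e6:.1f} MB/s per machine")
print(f"Previous: {previous_total / 1e9:.1f} GB/s total, "
      f"{previous_total / 910 / 1e6:.1f} MB/s per machine")
# Google:   14.7 GB/s total, 14.7 MB/s per machine
# Previous: 4.8 GB/s total, 5.3 MB/s per machine
```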

Sometimes you need to sort more than a terabyte, so we were curious to find out what happens when you sort more and gave one petabyte (PB) a try. One petabyte is a thousand terabytes, or, to put this amount in perspective, it is 12 times the amount of archived web data in the U.S. Library of Congress as of May 2008. In comparison, consider that the aggregate size of data processed by all instances of MapReduce at Google was on average 20PB per day in January 2008.

It took six hours and two minutes to sort 1PB (10 trillion 100-byte records) on 4,000 computers. We’re not aware of any other sorting experiment at this scale and are obviously very excited to be able to process so much data so quickly.
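
The same arithmetic applied to the petabyte run (again, our math rather than a figure from the post):

```python
# Aggregate and per-machine throughput for the 1PB sort.
PB = 10**15                      # one petabyte in bytes
seconds = 6 * 3600 + 2 * 60      # six hours and two minutes
total = PB / seconds             # across 4,000 machines
print(f"{total / 1e9:.0f} GB/s total, {total / 4000 / 1e6:.1f} MB/s per machine")
# 46 GB/s total, 11.5 MB/s per machine
```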

An interesting question came up while running experiments at such a scale: Where do you put 1PB of sorted data? We were writing it to 48,000 hard drives (we did not use the full capacity of these disks, though), and every time we ran our sort, at least one of our disks managed to break (this is not surprising at all given the duration of the test, the number of disks involved, and the expected lifetime of hard disks). To make sure we kept our sorted petabyte safe, we asked the Google File System to write three copies of each file to three different disks.
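
A quick expected-failures estimate makes the “not surprising at all” point concrete. Assuming a roughly 3% annualized disk failure rate (a common ballpark for datacenter drives, and our assumption rather than a number from the post):

```python
# Expected disk failures during one six-hour run over 48,000 drives.
disks = 48_000
annual_failure_rate = 0.03       # assumed ballpark, not a figure from the post
run_hours = 6 + 2 / 60           # six hours and two minutes
hours_per_year = 24 * 365

expected = disks * annual_failure_rate * run_hours / hours_per_year
print(f"Expected disk failures per sort: {expected:.2f}")  # ~0.99
```

With an expectation of about one failure per run, losing “at least one” disk every time is exactly what the math predicts, hence the triple replication.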

One of the first posts I wrote for this blog last summer tried to define what we at Google mean when we talk about the concept of net neutrality.

Broadband providers — the on-ramps to the Internet — should not be allowed to prioritize traffic based on the source, ownership or destination of the content. As I noted in that post, broadband providers should have the flexibility to employ network upgrades, such as edge caching. However, they shouldn’t be able to leverage their unilateral control over consumers’ broadband connections to hamper user choice, competition, and innovation. Our commitment to that principle of net neutrality remains as strong as ever.

Some critics have questioned whether improving Web performance through edge caching — temporary storage of frequently accessed data on servers that are located close to end users — violates the concept of network neutrality. As I said last summer, this myth — which unfortunately underlies a confused story in Monday’s Wall Street Journal — is based on a misunderstanding of the way in which the open Internet works.

Edge caching is a common practice used by ISPs and application and content providers in order to improve the end user experience. Companies like Akamai, Limelight, and Amazon’s Cloudfront provide local caching services, and broadband providers typically utilize caching as part of what are known as content distribution networks (CDNs). Google and many other Internet companies also deploy servers of their own around the world.

By bringing YouTube videos and other content physically closer to end users, site operators can improve page load times for videos and Web pages. In addition, these solutions help broadband providers by minimizing the need to send traffic outside of their networks and reducing congestion on the Internet’s backbones. In fact, caching represents one type of innovative network practice encouraged by the open Internet.
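
The core mechanism is no more exotic than a lookup table with eviction. Here is a toy sketch (ours, not any CDN’s actual implementation) of an edge cache that serves popular objects locally and crosses the backbone only on a miss:

```python
from collections import OrderedDict

class EdgeCache:
    """Tiny LRU cache standing in for an edge server near end users."""

    def __init__(self, capacity, fetch_from_origin):
        self.capacity = capacity
        self.fetch_from_origin = fetch_from_origin  # slow long-haul fetch
        self.store = OrderedDict()                  # least recently used first

    def get(self, url):
        if url in self.store:
            self.store.move_to_end(url)        # hit: served locally
            return self.store[url]
        content = self.fetch_from_origin(url)  # miss: one trip to the origin
        self.store[url] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)     # evict the coldest object
        return content

# Hypothetical usage: the first request pays the long-haul cost;
# every later request for the same object is served from the edge.
cache = EdgeCache(capacity=100, fetch_from_origin=lambda url: f"<content of {url}>")
cache.get("/popular-video")
cache.get("/popular-video")
```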

Google has offered to “colocate” caching servers within broadband providers’ own facilities; this reduces the provider’s bandwidth costs since the same video wouldn’t have to be transmitted multiple times. We’ve always said that broadband providers can engage in activities like colocation and caching, so long as they do so on a non-discriminatory basis.

All of Google’s colocation agreements with ISPs — which we’ve done through projects called OpenEdge and Google Global Cache — are non-exclusive, meaning any other entity could employ similar arrangements. Also, none of them require (or encourage) that Google traffic be treated with higher priority than other traffic. In contrast, if broadband providers were to leverage their unilateral control over consumers’ connections and offer colocation or caching services in an anti-competitive fashion, that would threaten the open Internet and the innovation it enables.

Despite the hyperbolic tone and confused claims in Monday’s Journal story, I want to be perfectly clear about one thing: Google remains strongly committed to the principle of net neutrality, and we will continue to work with policymakers in the years ahead to keep the Internet free and open.


When I look back on four years of tracking Old St. Nick on Christmas Eve, I can’t help but smile. The Santa tracker has really come a long way. I always thought NORAD’s Santa Tracker was a great holiday tradition, but I felt like it could have been even better if people could visualize exactly where Santa was on Christmas Eve. So in 2004, shortly after Keyhole was acquired by Google, we followed Santa in the “Keyhole Earth Viewer” — Google Earth’s original name — and we called it the “Keyhole Santa Radar.” The audience was relatively small since Keyhole was still a for-pay service at that point, and we hosted everything on a single machine shared with the Keyhole Community BBS server. We probably should have had three separate servers to host the Santa tracker — that first year, we had only a portion of a single machine. That night, about 25,000 people kept tabs on Santa and, needless to say, wreaked some havoc on our servers!

Over the next two years, our Santa-tracking efforts improved dramatically. By December 2005, Keyhole had become Google Earth and our audience had become much, much larger. Our “Santa Radar” team also grew: we used greatly improved icons from Dennis Hwang, the Google Doodler, and set up 20 machines to serve the tracking information. My colleague Michael Ashbridge took over the software and more than 250,000 people tracked Santa on Google Earth that Christmas Eve. In 2006, Google acquired SketchUp, a 3D modeling tool that enabled us to include models of Santa’s North Pole workshop and sleigh. We also incorporated a tracking feed directly from NORAD’s headquarters, and we were now displaying NORAD’s information in Google Earth. That year, more than a million people tracked Santa.

In 2007, Google became NORAD’s official Santa Tracking technology partner and hosted www.noradsanta.org. In addition to tracking Santa in Google Earth, we added a Google Maps tracker and integrated YouTube videos into the journey. Now, we had Santa on the map and on “Santa Cam” arriving in several different locations around the world, with commentary in six different languages. The heavy traffic — several million users — put Google’s infrastructure to the test, but with some heroic work by our system reliability engineers, the Santa Tracker worked continuously.


The holidays are a time for giving, and Googlers across the globe have found some creative ways to give back to their communities this season. From raising money and crafting greeting cards to building gingerbread houses and giving blood, Googlers from east to west have been busy spreading good cheer. We’ve highlighted just a few of these efforts here, and we’re looking forward to many more opportunities to give back in the new year.

London
The UK engineering recruitment team started to plan its annual Secret Santa gift exchange. But as they began thinking about last year, they realized that hardly anyone on the team could remember what they’d received, let alone given. Instead of spending 10 pounds on gag gifts, they decided to use the money to make a difference. After discovering that a local children’s hospital was in desperate need of gifts, they quickly raised enough money to buy a Nintendo Wii gaming console for one of the wards.


Mexico City
In the past, Google has held a “Doodle 4 Google” contest in the US, the UK, and Australia, inviting kids K-12 to submit a homepage doodle inspired by a particular theme. This year Mexico held its first such contest (theme: “the Mexico we want”). For each doodle submitted, Google donated to a non-profit that works to eradicate childhood malnutrition in Mexico. In total, more than 70,000 kilos (154,000 pounds) of food and aid were donated. The winner, Ana Karen Villagómez, was recently recognized in a ceremony in Mexico City; her doodle will appear on the Google homepage on January 6.


Boston and beyond
Boston Googlers delivered gifts to some very grateful students at a local school and spent the morning reading and playing with the children. The Chicago office held its first-ever holiday blood drive, donating 36 units of blood. And the Ann Arbor office held a “CANstruction” competition, creating sculptures out of canned food, personal items and baby items, which were all later donated.


Every year right about now we round up our blogging activity across Google. Ready? Here goes.

This is our 368th post of the year on the main Google blog, which is 23% more than in 2007. In addition to more posts, we are thrilled to know that we have many more readers now — 78% more, to be exact. The number of unique visitors jumped from 6,738,830 last year to more than 12 million (12,000,723) in 2008. And readers are coming from all over: the UK, Canada, India, Australia, Germany, France, Spain, Japan and beyond. The top non-Google referrers are Yahoo, Digg, Reddit, Lifehacker and Slashdot.

We posted quite a bit about new products (10) and new product features (56), but nothing caused as much excitement as our earlier-than-planned unveiling of Google Chrome. This post alone had 1,735,093 unique visitors and generated 12% of our total-year pageviews on the blog! There was also the much-anticipated announcement of the first Android-powered phone. And people enjoyed reading about our design philosophies. Who knew a little change to a favicon would generate such interest?

But it wasn’t all just product news; there was much else to cover in 2008. To mark Google’s 10th birthday, we took a moment to reflect on the enormous impact the Internet has had on people’s lives since our founding. Some of our in-house experts shared their thoughts on how various technologies will evolve in the next 10 years.

Like many of you, we were on the edge of our seats watching all of the U.S. election action. We posted 27 times about political subjects, providing information about voting tools, how the political process works, and what was top of mind on Election Day. It’s clear that technology will play an even bigger role in politics in years to come.

Of course, we had some fun too: We kept our long-standing April Fools’ Day tradition going with the announcement of Project Virgle; we covered new ways to get around the Googleplex and the masterminding of a giant Ferris wheel; and we raised our glass to a couple who got married with Google.

We’re looking forward to another robust year of keeping you informed of all the goings-on at Google. In the meantime, we wish you and yours a very happy New Year.


