
August 6, 2012

Wozniak: Cloud Computing Will Cause "Horrible Problems"

"Wozniak didn't offer much in the way of specifics . . . . [but said, 't]he more we transfer everything onto the web, onto the cloud, the less we're going to have control over it.'" Steve Wozniak was the inventor of the Apple I and Apple II computers.

More at Business Insider. You can find more re- the kinds of problems I worry about by clicking on the label, "Worldbeam," at the bottom of this post.

June 24, 2011

Telecomix and We Rebuild

Telecomix seems to be a subset of We Rebuild, or perhaps a name associated with certain of its news and other functions. Among other efforts, Telecomix worked to provide alternate communication channels during the Mubarak regime's shutdown of Egyptian internet access (see Egypt/Main Page).

We Rebuild describes itself:

We Rebuild is a decentralized cluster of net activists who have joined forces to collaborate on issues concerning access to a free Internet without intrusive surveillance. . . . There are no leaders, nor members. We Rebuild is simply an international chaotic event, and our actions can not be predicted in detail. We are a flow of passions, and we sometimes refer to our driving force as “data-love”.

. . . . The We Rebuild initiative promotes and participates in building the Internet to be accessible for everyone everywhere, enabling true freedom of speech. This is something which can not be guaranteed by states or corporations, but requires the polyvocal voice of the Internet. You will run in to us when you least expect it, especially if you are making decisions about the Internet. But since our strategies are based in the passionate sharing of ideas, you will most likely be happy to see us.

More at the WeRebuild wiki and the Telecomix News Agency; see also Datalove.

May 27, 2011

Why We Need Net Neutrality: ISPs Are Already Throttling the Internet

If you had any remaining doubt as to whether and how much this is happening, wonder no more. Two projects have shown that Comcast and Road Runner consistently engage in substantial, discriminatory slowing or throttling of internet traffic (euphemistically referred to as "shaping") both to and from users, and that "Comcast, Road Runner (from Time Warner Cable), and Cox all use downstream shaping." (More at Boing Boing)
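
As a rough illustration of how such measurements work in principle, here's a minimal sketch of the comparison idea only, not the actual methodology, tools, or thresholds of the projects mentioned above; the function name, example numbers, and cutoff ratio are my own hypothetical choices:

```python
# Minimal sketch of throttling detection by comparison (illustrative only; not
# the actual methodology or thresholds used by the projects mentioned above).
# Idea: run repeated back-to-back transfers -- one that looks like the targeted
# application (e.g. BitTorrent-like payloads) and one "control" transfer of
# random bytes on the same path -- then compare their throughputs.
from statistics import median

def looks_shaped(app_kbps, control_kbps, ratio=0.7, min_trials=5):
    """Flag likely shaping if the application-like flow is consistently much
    slower than the control flow over repeated trials."""
    if min(len(app_kbps), len(control_kbps)) < min_trials:
        raise ValueError("need more trials for a meaningful comparison")
    return median(app_kbps) < ratio * median(control_kbps)

# Hypothetical measurements (kbit/s) from repeated trials on one connection:
bittorrent_like = [220, 240, 210, 260, 230, 250]
control_random  = [940, 910, 980, 890, 960, 930]

print(looks_shaped(bittorrent_like, control_random))  # True -> consistent with shaping
```

A real test would repeat this back-to-back on the same connection, at different times of day, since ordinary congestion slows everything down equally; shaping shows up when the application-like flow is singled out.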

They claim they only do it to help manage traffic volumes. But there are many examples of known, wrongful censorship of political or other content, within as well as outside the U.S.; see, e.g., here (Comcast and/or Symantec blocked all e-mails containing the URL of a site calling for investigation into whether Pres. Bush committed impeachable offenses in connection with the push to invade Iraq, successfully reducing the impact of activists' efforts), here (AT&T censored a Pearl Jam concert by deleting lyrics criticizing Bush), or here (Mindspring and OneNet Communications, successively, blocked the site hosting the Nuremberg Files).

As Lawrence Lessig stated in a recent article, "The innovation commons of the Internet threatens important and powerful pre-Internet interests. During the past five years, those interests have mobilized to launch a counterrevolution that is now having a global impact."

Bad enough that, as "Napoleon said . . . it wasn't necessary to completely suppress the news; it was sufficient to delay the news until it no longer mattered." But worse, ISPs can suppress any info they choose in ways that make it unlikely that many users will ever become aware that anything's been filtered out.

And as stated at sp!ked-IT with reference to Wikileaks, among others, "when an ISP removes [or blocks] content, it invokes the cyber equivalent to the death sentence. When an ISP acts it can effectively destroy a business or censor a political campaign, by making access to that website impossible."

If you agree that protecting the internet as a source of uncensored political information is one of the most urgent issues of our time, please spread the word about it.

January 21, 2011

The Commons that Was the Internet, & Why the Creative Explosion It Gave Us May Soon Be Over

Lawrence Lessig has a new article at Foreign Policy summarizing important factors behind the explosive growth of the Internet, and the imminent threats that could end it:

A “commons” is a resource to which everyone within a relevant community has equal access. It is a resource that is not, in an important sense, “controlled.” Private or state-owned property is a controlled resource; only as the owner specifies may that property be used. But a commons is not subject to this sort of control. Neutral or equal restrictions may apply to it (an entrance fee to a park, for example) but not the restrictions of an owner. A commons, in this sense, leaves its resources “free.”

. . . . But within American intellectual culture, commons are treated as imperfect resources. They are the object of “tragedy,” as ecologist Garrett Hardin famously described. Wherever a commons exists, the aim is to enclose it. . . .

For most resources, for most of the time, the bias against commons makes good sense. When resources are left in common, individuals may be driven to overconsume, and therefore deplete, them. But . . . . [s]ome resources are not subject to the “tragedy of the commons” because some resources cannot be “depleted.” . . . For these resources, the challenge is to induce provision, not to avoid depletion. The problems of provision are very different from the problems of depletion—confusing the two only leads to misguided policies.

* * * * *
. . . . [T]he Internet was born at a time when a different philosophy was taking shape within computer science. This philosophy ranked humility above omniscience and anticipated that network designers would have no clear idea about all the ways the network could be used. It therefore counseled a design that built little into the network itself, leaving the network free to develop as the ends (the applications) wanted.

The motivation for this new design was flexibility. The consequence was innovation. Because innovators needed no permission from the network owner before different applications or content got served across the network, innovators were freer to develop new modes of connection. . . . Since the network was not optimized for any single application or service, the Internet remained open to new innovation. . . .

* * * * *
Every significant innovation on the Internet has emerged outside of traditional providers. . . . This trend teaches the value of leaving the platform open for innovation. Unfortunately, that platform is now under siege. Every technological disruption creates winners and losers. The losers have an interest in avoiding that disruption if they can. This was the lesson Machiavelli taught, and it is the experience with every important technological change over time. It is also what we are now seeing with the Internet. The innovation commons of the Internet threatens important and powerful pre-Internet interests. During the past five years, those interests have mobilized to launch a counterrevolution that is now having a global impact.

The article's not super-long but contains much more that's well worth reading.

UPDATE: Great audio of Lessig here discussing the policy considerations underlying copyright law and some reforms we might consider that could actually afford greater compensation to artists while de-criminalizing non-commercial re-mixing and other uses.

January 12, 2011

A Few Headlines: "Learned Helplessness" in Schools, Missing Billions, & More Media Control

1. At DU, links to info re- "CIA torture theorist working for KIPP charter schools": former American Psychological Association (APA) President Martin Seligman originated a theory re- "learned helplessness" which, if I understand correctly, involves breaking down individuals' autonomy and replacing it with uncritical compliance with authority. Seligman actively assisted in the development of the CIA’s torture techniques, and now his theories are apparently being used on students in charter schools. More at the link and at Schools Matter.

2. At The Fiscal Times, "Billions of Dollars 'Vanish' in Afghanistan." "The United States has spent more than $55 billion trying to rebuild war-torn Afghanistan and win the confidence of the people, but most of that money can’t be accounted for or has been wasted on failed projects." More at the link.

3. At HuffPo, "FCC breaks Obama's promise, allows corporate censorship online with fake Net Neutrality"; more at the link.

4. The FCC and Department of Justice may be about to approve a proposed merger between Comcast and NBC Universal. Below, Al Franken explains why this would be disastrous for the rest of us and how you can help stop it.


September 6, 2010

Great Poster Explaining Net Neutrality:

Click on the image for a larger version, or go to OnlineMBA for an even bigger version.

February 23, 2010

Curating the Net

Great article at Wired re- how Google works:

Google’s engineers have discovered that some of the most important signals [re- potential improvements to Google's search algorithm] can come from . . . [t]he data people generate when they search – what results they click on, what words they replace in the query when they’re unsatisfied, how their queries match with their physical locations . . . . The most direct example of this process is what Google calls personalized search — an opt-in feature that uses someone’s [personal] search history and location as signals to determine what kind of results they’ll find useful. . . .

Take, for instance, the way Google’s engine learns which words are synonyms. “We discovered a nifty thing very early on,” Singhal says. “People change words in their queries. So someone would say, ‘pictures of dogs,’ and then they’d say, ‘pictures of puppies.’ So that told us that maybe ‘dogs’ and ‘puppies’ were interchangeable. We also learned that when you boil water, it’s hot water. We were relearning semantics from humans, and that was a great advance.”

But there were obstacles. Google’s synonym system understood that a dog was similar to a puppy and that boiling water was hot. But it also concluded that a hot dog was the same as a boiling puppy. The problem was fixed in late 2002 by a breakthrough based on philosopher Ludwig Wittgenstein’s theories about how words are defined by context. As Google crawled and archived billions of documents and Web pages, it analyzed what words were close to each other. “Hot dog” would be found in searches that also contained “bread” and “mustard” and “baseball games” — not poached pooches. That helped the algorithm understand what “hot dog” — and millions of other terms — meant. “Today, if you type ‘Gandhi bio,’ we know that bio means biography,” Singhal says. “And if you type ‘bio warfare,’ it means biological.”
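
Here's a toy sketch of the context idea described in that passage, assuming a purely made-up mini-corpus and my own scoring function (an illustration of the general principle, not Google's actual system):

```python
# Toy illustration of context-based synonym scoring (not Google's actual code).
# Idea: two terms are "interchangeable" when the words that appear near them
# in a corpus overlap heavily (Jaccard similarity of their context sets).
from collections import defaultdict

# Hypothetical mini-corpus standing in for billions of crawled documents.
corpus = [
    "the dog chased the ball in the park",
    "a puppy chased a ball in the park",
    "we grilled a hot dog with mustard at the baseball game",
    "hot dog vendors sell mustard and buns at baseball games",
    "the puppy slept in the park after chasing the ball",
]

def context_sets(docs, window=3):
    """Map each term (treating 'hot dog' as one unit) to the set of nearby words."""
    contexts = defaultdict(set)
    for doc in docs:
        words = doc.split()
        # Merge the bigram "hot dog" into a single token first.
        tokens, i = [], 0
        while i < len(words):
            if words[i] == "hot" and i + 1 < len(words) and words[i + 1] == "dog":
                tokens.append("hot dog")
                i += 2
            else:
                tokens.append(words[i])
                i += 1
        for i, tok in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            contexts[tok].update(t for j, t in enumerate(tokens[lo:hi], lo) if j != i)
    return contexts

def similarity(a, b, contexts):
    """Jaccard overlap of the two terms' context sets."""
    ca, cb = contexts[a], contexts[b]
    return len(ca & cb) / len(ca | cb) if ca | cb else 0.0

ctx = context_sets(corpus)
print(similarity("dog", "puppy", ctx))      # relatively high: shared ball/park contexts
print(similarity("hot dog", "puppy", ctx))  # low: mustard/baseball vs. ball/park
```

The point of the toy scoring is only that "dog" and "puppy" share contexts like "ball" and "park," while "hot dog" keeps its own neighborhood of "mustard" and "baseball," so the two stop looking interchangeable.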

One reason I'm thrilled with the internet is that through it, we're all helping Google and others create scientific models of human linguistic intelligence, among other things. I trust Google will eventually share the results of their and our efforts in this and other areas of knowledge, although I assume we'll have to pay for them.

But I'm posting mainly to try to make sure we all understand that the role played by search engines and other online intermediaries in selecting and ranking search results is absolutely critical in shaping not just our online lives, the importance of which will only continue to grow, but also our knowledge and beliefs about history, current events, etc., and thus our non-virtual realities.

(And never doubt that non-virtual realities – control over water, guns, infrastructure, energy – will continue to matter. Even the 'net needs servers and power.)

Per the OED, "curate" derives from the Latin word for "care." As a noun, it means "a member of the clergy engaged as assistant to a parish priest." As a verb, which is more or less the sense I mean here, it means to "select, organize, and look after the items in (a collection or exhibition)."

That's more or less what search engines do: select and organize (rank) info on the net. (Although they don't care for it, unless you count selecting it as "care." Sometimes info survives on the net precisely so long as it is overlooked, as when the info proves embarrassing to the authority that put it there. More often, the expense of keeping info on the net means that if it's ignored, it eventually disappears.)

Not only are companies like Google curating our realities, but they're not telling us what their curatorial guidelines are. They keep many of the factors that determine search results a closely guarded secret. They need to do this because they're commercially-driven entities competing with others.

Doubtless all or most of the criteria incorporated into their algorithms result in better service to their users. But this secrecy also means we can never be sure we're not missing out on info that commercial intermediaries consider unimportant or even disadvantageous to them for us to find.

Less ominously, it also simply deprives us of the opportunity to critically examine and debate not only how our world is being shaped, but also whether we might want to shape it differently. That is, even if all the criteria used to determine search results and the like reflect solely the users' desires, when we become aware of our criteria and desires, we sometimes decide it's worth making a conscious effort to change them.

But it's virtually impossible to do that without knowing what they are.

January 16, 2008

AT&T Considering Monitoring All Internet Traffic

Mind-bogglingly bizarre: per Slate.com, "last week AT&T announced that it is seriously considering plans to examine all the traffic it carries for potential violations of U.S. intellectual property laws. The prospect of AT&T, already accused of spying on our telephone calls, now scanning every e-mail and download for outlawed content is way too totalitarian for my tastes." I'm not sure why they'd bother, if that were the only motivation; sounds more like they're floating an after-the-fact rationalization for Mark Klein's allegations.

November 27, 2007

Internet Bill of Rights?

According to eGov monitor, Italy and Brazil have "endorsed a joint declaration committing themselves to reach as soon as possible a shared and planned resolution of network rights," to be "prioritized" at the 2008 Internet Governance Forum in New Delhi.

Issues to be addressed would include privacy, data protection, freedom of expression, free access to information and knowledge, universal accessibility, network neutrality, interoperability, open standards, the right to innovate, a fair and competitive market and consumers' safeguards.

Great idea. Who gets to write it? And how can we get it in the U.S.?