Category Archives: Technic

Sea-Level Economy Mapping: A big data project for future equality

Here’s something we can use big data for today: Let’s set the socioeconomic benchmark against which society’s response to rising sea levels can be measured, across all income levels.

Everyone carrying a phone today is throwing off location data that, if anonymized, collected and analyzed, would show what low-lying land is most used today. From that, we can project the potential economic disruptions that will be caused by various levels of sea level change, as many tools do today. We can look at property ownership, travel patterns, rent and home prices that will be impacted by rising water and, like the Dutch when they decided to hold the sea back, make some long-term decisions that will save everyone, not just the privileged, from personal tragedy and economic disaster as their homes, communities and job networks disappear in the waves.
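
As a rough sketch of the sort of analysis this implies (my illustration only, with made-up field names, an assumed elevation lookup and an arbitrary two-meter scenario), the core join is simple: take anonymized, aggregated location counts, look up elevation, and flag heavily used land below a chosen sea-level-rise threshold.

```python
# Hypothetical sketch only: flag heavily used locations that sit below a
# projected sea-level-rise threshold. The Ping fields, the elevation
# lookup and the 2.0 m scenario are illustrative assumptions.
from dataclasses import dataclass
from typing import Iterable, List, Tuple

@dataclass
class Ping:
    lat: float
    lon: float
    visits: int  # aggregated, anonymized visit count for this grid cell

def lookup_elevation_m(lat: float, lon: float) -> float:
    """Placeholder for a digital elevation model (DEM) query."""
    raise NotImplementedError

def at_risk_cells(pings: Iterable[Ping], rise_m: float = 2.0) -> List[Tuple[Ping, float]]:
    """Return (cell, elevation) pairs for heavily used, low-lying land."""
    flagged = []
    for p in pings:
        elevation = lookup_elevation_m(p.lat, p.lon)
        if elevation <= rise_m:
            flagged.append((p, elevation))
    # Sort so the most-used vulnerable land surfaces first.
    return sorted(flagged, key=lambda pair: pair[0].visits, reverse=True)
```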

When the ocean rises, and it will rise enough that many low-lying cities in 40, 80 or 120 years will be under many inches or feet of water, everyone’s lives will be disrupted. If we start tracking the use of public and private property, shared-use common areas and investments, such as the cost of infrastructure that may be destroyed and the infrastructure that needs to be created to hold the seas back, those factors can be mapped to provide the best outcome for all. At least, it will give everyone a baseline against which to measure the impacts. Democracy can take care of the rest, with an assist from the market, but a market-only solution will leave far too many losers.

Without some benchmark to measure the social cost of responding to climate change, the wealthiest people will almost certainly benefit disproportionately compared to others who live and work in flooded areas. We’ll see cries for reparations for lost land from every quarter, but the rich will have the loudest voice, as we know from the state of political speech today.

The homes of the rich that line sea coasts everywhere will be lost, but so will many of the homes occupied (though not necessarily owned) by the poor and middle class. Who will get the help necessary to relocate? Who will have new public rights-of-way running through their neighborhoods when existing rail and road infrastructure must be moved inland or raised above the rising seas? Will insurers make the rich whole and, like home insurance today, leave most people less than half-whole when the cost of relocation is counted?

I am not arguing here that anyone should get resources, only for a measurement so that, when the crisis comes, we will have had many years, even decades, to have the national and international conversation about the mass migration of people fleeing the high tide. We may decide it’s time to move past many of the institutions we rely on today.

If we’re going to go through this together, we need the data to understand the distributed social cost of lands and infrastructure — technical, industrial, social and even personal networks that currently provide support to families. The problem with this statement is that it appears naive, because we live in a society where almost everyone thinks they’ve made their way in the world alone. That myth is going to collapse as the world starts denying us land and resources we used to have.

Yet we can get through this, as humans have done many times in history, if we recognize the real costs and opportunities in radical change. Perhaps, with lots more data and people trained to think through these complex issues armed with real-time and historical perspectives provided by big data strategies, we might actually realize we are in this together.

Posted in Economic, Technic

Small fusion project with potentially big results

Reading with interest about the Sandia National Labs’ Z machine, an electromagnetic fusion reaction generator, if it ever works. More than a decade behind schedule, the project is the least heavily funded of the various fusion projects in the U.S. and Europe. Someone will make fusion work, and it will be a tremendous step forward toward sustainability. Check out how the Z uses a ring of supermagnets and tuned laser bursts to compress and, hopefully, ignite a fusion reaction. It’s simple in its design, essentially a containment field. Let’s hope we keep funding this kind of small project instead of defaulting to one big-bang approach to fusion. It will yield more results we can learn from.

Posted in Life & Everything Else, Technic

Productive meetings vs. the cluster-call: New collaboration thoughts

A couple of weeks ago, I asked via LinkedIn and Twitter what makes a meeting productive. The question has led me to conclude that a new type of collaborative activity, the cluster-call, is an opportunity for greater productivity, but it can also be a barrier to innovation when it is not managed differently than a meeting.

For several years, a new kind of collaboration activity has been developing on the foundation of telecommunications: the cluster-call, a continuous use of partial attention via conference call. These are virtual gatherings, typically scheduled so that all the participants can be available if — and that’s the key condition — if something comes up for which they have responsibility. Cluster-calls typically involve 30 to 60 people, all of whom are splitting their attention between the call, listening for hot-button issues, and some form of work or diversion. One hears these calls described as “a meeting that is getting a lot of momentum.” I often suspect that these calls are the source of the hours of social media use, or Solitaire play, that managers fear to count on their activity reports. Cluster-calls are, however, a viable form of collaboration at the right scale.

Cluster-calls work when they are not substitutes for meetings with an agenda that requires a decision. They are excellent collaboration environments at the right size and with the right scope. Teams, rather than cross-team collaborations, are best served by the constant connectivity of a cluster-call. As people continue to work, they can tap anyone on their team, or reach out individually to bring someone from another team onto the call, to address questions, discover information and brainstorm. But try to turn a cluster-call into a daily meeting, treating it like a scrum or stand-up meeting, where people use the immediacy of the agenda to get work done, and the call will degenerate into a protracted distraction from productive work.

So, how do you have a productive meeting? Or a productive cluster-call? A meeting, whether physical or virtual, is defined by its goals. A cluster-call is a setting for outcomes, but without an agenda, it becomes primarily a regulator of change. On large cluster-calls, people tend to focus on what can stop or interrupt normal business activity. They flag concerns without being obligated to provide solutions, so these kinds of gatherings are hotbeds of change prevention.

“[It] depends on the sort of meeting, but generally, when clear goals have been defined & everyone knows what they’re supposed to do,” replied Phillip Mueller, a German entrepreneur living in The Netherlands. That describes a productive use of time, but it could apply to any kind of gathering.

Robert Reddick, a Charlotte, N.C. entrepreneur, offers that a productive meeting is “a place to pre-flight and execute decisions,” also a result that could come from a meeting or a cluster-call. But without an agenda, the framework for decision-making is typically absent.

A cluster-call, which is simply a way of describing being simultaneously connected to a virtual space, works great for small groups who are dealing with a lot of uncertainty. In this age of demanding competition, where people come and go from small projects, cluster-calls let people learn quickly in small groups. A scrum meeting, for instance, can be extremely productive, because people share information as the need for sharing becomes apparent. People talk about things and when someone on the call doesn’t know about the project or subject of conversation, they can ask. Often the instructions come offline, away from the cluster-call, but the group determines when that is necessary.

Small groups constantly connected can thrive. Bring 30 or 60 people together, a common practice these days in larger companies, and the productive work becomes an exploration of limits. The limits of the group’s knowledge, the limits of its tolerance for new ideas and for change, and the limits of the organization’s flexibility become apparent. The outcome is that everyone is quiet unless they see the need to raise a flag. It’s easier to play along and be quiet in these large meetings.

There is a breed of participation in cluster-calls: grandstanding. It becomes a regular occurrence that a small, consistent group dominates the calls, exercising their expertise without actually intending to share that expertise. Knowledge in a crowded cluster-call is like the knowledge that drives crowds: the noisiest people keep things moving in one direction.

Meetings should be recognized as events with agendas. If the agenda isn’t addressed, that is not a productive meeting. On the other hand, a small cluster-call can work effectively without an agenda, though it must not become routine or it will descend into unproductive activity. A regularly scheduled call of 40 to 60 people (I’ve been on these calls with up to 90 participants on several occasions), even with an agenda, becomes an exercise that reinforces a downward flow of information, because any newcomer and any controversial idea is likely to be squelched by the people most inclined to grandstand.

Leadership based on the intimidating presence of a crowd that will quietly agree destroys the organization. That is also why the road from democracy to tyranny is always paved by populists.

Posted in Business & Technology, Social & Political, Technic

Privacy, the NSA, and cranky customers

There’s a discussion on the VRM project mailing list (home page here), hosted by Doc Searls and Harvard University, that suggests the erosion of personal privacy may have been planned by our government or, at least, the National Security Agency. Of course it was planned. Here’s my response to the list:

Agreed, the fingerprint reader doesn’t require nightmare scenarios involving severed fingers to justify outrage. The simple fact is that a thief could walk up to you on the street and point a gun at you, demanding that you unlock your phone. That’s a variation on the kidnapping-for-ATM-access scheme used in some U.S. cities, and in many cities in the developing world, as the most convenient way to rob someone. The fingerprint reader simply makes that easier to do. It’s not a new threat to privacy so much as evidence of the continuing destruction of privacy we’ve allowed to happen.

Point of information, since I broke the original Clipper Chip story (learn more) for MacWEEK (a day before the Times): The NSA had several concurrent efforts to intrude on public crypto standards in the early 90s. They had established an advisory role with NIST (National Institute of Standards and Technology) in the late 80s with the plan of driving backdoors into all potential public crypto standards. It wasn’t a fallback strategy adopted after Clipper was outed for its NSA backdoor, but part of a campaign on many fronts that was largely ignored by public policy people despite complaints from privacy advocates, who do not all wear tin hats.

At the same time as Clipper, the NSA was imposing its advice on NIST for the MD5 message-digest hash algorithm that is used to generate 128-bit keys, which opened the door to what we are living with today. The NYT’s John Markoff and I both reported that, too, though our publications’ archives don’t seem to be freely accessible. National cryptography policy has resided in the armed forces for far too long, to the point where it is negatively impacting U.S. technical competitiveness. This did not happen by accident; it was planned and abetted by both political parties.
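
As a point of reference (my aside, not part of the original reporting): MD5 produces a 128-bit, or 16-byte, digest regardless of input length, which is why it was attractive as a source of 128-bit keys. A minimal Python illustration:

```python
import hashlib

# MD5 yields a 128-bit (16-byte) digest no matter how long the input is.
digest = hashlib.md5(b"example message").digest()
print(len(digest) * 8)                              # 128
print(hashlib.md5(b"example message").hexdigest())  # 32 hex characters
```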

I think it is well past time to stop referring to privacy advocates as shouting cranks. It trivializes legitimate citizen concern and, because of the intrusion into public communication, marginalizes consumer complaints. Customers have been waving companies like Apple and Microsoft off supporting Fort Meade for years. But these companies, both of which resisted NSA suggestions that they compromise their security in the 1990s, have continually heard their privacy-concerned customers described as cranks and edge cases. We eroded our own privacy and willingly sacrificed the economic power in personal information.

We got here by not being activist enough, not by being too crazy in making the case that the security of knowing we speak in private, in our homes and in our papers, is a 21st-century human necessity, and certainly a U.S. necessity.

Posted in Business & Technology, Technic

Amazon Cloud Drive: Migration Reality

Amazon Cloud Drive: Learn More. Click the link and consult the pricing table at the bottom of the page. Go ahead, I will wait. All future media purchased from Amazon will be stored free on the service, but it is the migration costs I’m looking at here.

Cloud Drive is “free” — that is, you can store 5 GB of music, about 1,000 songs or 20 minutes’ worth of HD video, before paying for additional storage. I looked up what it would cost to store my 1.6TB iTunes library: $1,600 a year.
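
That estimate is roughly consistent with the pricing table, assuming (my assumption; check the current page) a rate of about $1 per GB per year and ignoring the free 5 GB:

```python
# Back-of-the-envelope check of the annual cost to park a 1.6 TB library
# on Cloud Drive, assuming roughly $1 per GB per year (my reading of the
# pricing table at the time; verify against the current page).
library_tb = 1.6
gb_per_tb = 1024              # 1 TB ~ 1,024 GB
price_per_gb_per_year = 1.00  # assumed rate in USD

annual_cost = library_tb * gb_per_tb * price_per_gb_per_year
print(f"${annual_cost:,.0f} per year")  # about $1,638 per year
```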

My Zune library files, both the music I’ve purchased and the albums I listen to, are stored and accessible as part of the music subscription service.

If I were to embrace Cloud Drive as my media storage service, I’d be adding about $1,900 a year to my media costs in order to migrate my current collection to the Amazon service.

The everywhere-access of cloud storage is a great user experience, but not if you have to count storage costs in increments of hundreds of dollars a year. Amazon needs to assess the lifetime value of a heavy media buyer and recognize that zero-cost migration is the price it will pay to get a shot at a new customer in this environment.

Posted in Business & Technology, Technic

10-million times the safe limit

Reading this article, you’d get the impression that a 10-million-times increase in radiation could possibly be bad for you. Radiation in reactor’s building tests 10 million times above normal – CNN.com.

This exponentially dwindling amount of radiation means, according to Nishiyama, that it’s unlikely that sealife — and, several steps down the line, humans who might eat once contaminated seafood — will suffer greatly from the iodine-134 exposure.

Exponentially dwindling radiation levels mean that the radiation level started exponentially above normal radiation levels.
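
For context (my own aside, using an assumed half-life of roughly 53 minutes for iodine-134, not a figure from the article), “exponentially dwindling” is just ordinary radioactive decay, which is why the contamination fades quickly even after starting millions of times above normal:

```python
# Exponential-decay sketch. The ~53-minute half-life for iodine-134 is
# my assumption for illustration, not a figure from the CNN article.
HALF_LIFE_HOURS = 53 / 60

def remaining_fraction(hours: float) -> float:
    """Fraction of the original activity left after `hours`."""
    return 0.5 ** (hours / HALF_LIFE_HOURS)

print(remaining_fraction(1))   # ~0.46 of the original after one hour
print(remaining_fraction(24))  # effectively zero after a day
```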

“Fry me to the moon,” is the lyric that comes to mind.

Posted in Technic

Testing BlogPress

Here is a test of an iPad blogging tool, BlogPress. And it seems to work pretty well. I’d like to be able to link my Bit.ly account to the posting sent to Twitter.

Posted in Technic