Markets as conversations with robots

From the Google AI blog post “Towards a Conversational Agent that Can Chat About…Anything”:

In “Towards a Human-like Open-Domain Chatbot”, we present Meena, a 2.6 billion parameter end-to-end trained neural conversational model. We show that Meena can conduct conversations that are more sensible and specific than existing state-of-the-art chatbots. Such improvements are reflected through a new human evaluation metric that we propose for open-domain chatbots, called Sensibleness and Specificity Average (SSA), which captures basic, but important attributes for human conversation. Remarkably, we demonstrate that perplexity, an automatic metric that is readily available to any neural conversational models, highly correlates with SSA.

A chat between Meena (left) and a person (right).

Meena
Meena is an end-to-end, neural conversational model that learns to respond sensibly to a given conversational context. The training objective is to minimize perplexity, the uncertainty of predicting the next token (in this case, the next word in a conversation). At its heart lies the Evolved Transformer seq2seq architecture, a Transformer architecture discovered by evolutionary neural architecture search to improve perplexity.
 
Concretely, Meena has a single Evolved Transformer encoder block and 13 Evolved Transformer decoder blocks as illustrated below. The encoder is responsible for processing the conversation context to help Meena understand what has already been said in the conversation. The decoder then uses that information to formulate an actual response. Through tuning the hyper-parameters, we discovered that a more powerful decoder was the key to higher conversational quality.
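Since Meena’s training objective is minimizing perplexity, it may help to see what that metric actually is: the exponential of the average negative log-probability the model assigns to each actual next token. Here is a minimal sketch in Python (my illustration, not Google’s code):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(average negative log-probability per token).

    token_log_probs: natural-log probabilities a model assigned to each
    actual next token (e.g., each next word in a conversation).
    """
    avg_nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(avg_nll)

# A model certain of every token (probability 1.0) has perplexity 1;
# uniform guessing over a 10-word vocabulary gives perplexity 10.
print(perplexity([math.log(1.0)] * 5))   # → 1.0
print(perplexity([math.log(0.1)] * 5))   # → 10.0 (approximately)
```

Lower perplexity means less uncertainty about what comes next, which is why Google found it tracks the human-judged Sensibleness and Specificity Average so well.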
So how about turning this around?

What if Google sold or gave a Meena model to people—a model Google wouldn’t be able to spy on—so people could use it to chat sensibly with robots or people at companies?

Possible?

If, in the future (which is now—it’s freaking 2020 already), people will have robots of their own, why not one for dealing with companies, which themselves are turning their sales and customer service systems over to robots anyway?

People are the real edge

You Need to Move from Cloud Computing to Edge Computing Now!, writes Sabina Pokhrel in Towards Data Science. The reason, says her subhead, is that “Edge Computing market size is expected to reach USD 29 billion by 2025.” (Source: Grand View Research.) The second person “You” in the headline is business. Not the people at the edge. At least not yet.

We need to fix that.

By we, I mean each of us—as independent individuals and as collected groups—and with full agency in both roles. Edge computing involves us in both.

The article illustrates the move to Edge Computing this way:

The four items at the bottom (taxi, surveillance camera, traffic light, and smartphone) are at the edges of corporate systems. That’s what the Edge Computing talk is about. But one of those—the phone—is also yours. In fact it is primarily yours. And you are the true edge, because you are an independent actor.

More than any device in the world, that phone is the people’s edge, because no connected device is more personal. Our phones are, almost literally, extensions of ourselves—to a degree that being without one in the connected world is a real disability.

Given our phones’ importance to us, we need to be in charge of whatever edge computing happens on them. Simple as that. We cannot be puppets at the ends of corporate strings.

I am sure that this is not a consideration for most of those working on cloud computing, edge computing, or moving computation from one to the other.

So we need to make clear that our agency over the computation in our personal devices is a primary design consideration. We need to do that with tech, with policy, and with advocacy.

This is not a matter of asking companies and governments to please give us some agency. We need to create that agency for ourselves, much as we’ve learned to walk, talk and act on our own. We don’t have “Walking as a Service” or “Talking as a Service.” Because those are only things an individual human being can do. Likewise there should be things only an individual human with a phone can do. On their own. At scale. Across all companies and governments.

Pretty much everything written here and tagged VRM describes that work and ways to approach that challenge.

Recently some of us (me included) have been working to establish Me2B as a better name than VRM. It occurs to me, in reading this piece, that the e in Me2B could stand for edge. Just a thought.

If we succeed, there is no way edge computing gets talked about, or worked on, without respecting the Me’s of the world, and their essential roles in operating, controlling, managing and otherwise making the most of those edges—for the good of the businesses they deal with as well as themselves.

© 2020 ProjectVRM