The Longest Now


Designing life for episodic tyranny | 2: Social networks
Saturday November 12th 2016, 8:17 pm
Filed under: Uncategorized

For background, see also Part 1: Secure toolchains

Motivation

Imagine a Stasinario: while in a Tier 3 environment, you expect your social networks to be subverted, with people pressured to report on one another, and casual gatherings discouraged or explicitly outlawed.  Your contact with local colleagues and neighbors is always tinged with the certainty that eventually, one of them will report on the others, if only to stay out of trouble themselves. 
Assume that a few community members will be willing informants, and that everyone else would rather not inform, but will periodically be questioned by an adversary trying to prevent organizing or information-passing of any kind.  When questioned, you will be punished for sharing any information that can be shown to be false.  What sorts of preparation can you make in advance, for both offline and online gatherings? [Input needed from people facing this in closed systems, and in heavily-monitored activist movements.]

Social design options

1. Make gathering information more expensive.  Add plausible noise to the system; report frequently rather than rarely, so that any single report carries less signal?

2.  Human DDoS/noise: instead of LOIC [Anons], point collective noise generation at some unethical public db or data-collection effort.  Options: 1) set your devices to signal to such networks; 2) send your info, or generate random info to send there.

  2a. For human / minority-tracking databases (blacklists, registries of refugees or migrants from specific regions/religions): consider self-registration, and auto-registration of valid-looking but random identities.
  2b. Try SETI@Home-style noise, where a large number of devices each compute/produce small amounts of signal sent out along a given channel.
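A minimal sketch of the auto-registration idea in 2a, assuming nothing about any real registry's schema: the field names and value pools below are purely illustrative, and a real effort would need locale-appropriate data to look plausible.

```python
import random

# Illustrative-only name pools; a realistic generator would draw from
# locale-appropriate distributions rather than a short hardcoded list.
FIRST = ["Ana", "Omar", "Li", "Sara", "Ivan", "Mira"]
LAST = ["Haddad", "Chen", "Novak", "Okafor", "Silva", "Kaur"]

def random_identity(rng=random):
    """Generate one valid-looking but random registry entry."""
    year = rng.randint(1950, 2000)
    return {
        "first_name": rng.choice(FIRST),
        "last_name": rng.choice(LAST),
        # Days capped at 28 so every generated date is valid in every month.
        "dob": f"{year:04d}-{rng.randint(1, 12):02d}-{rng.randint(1, 28):02d}",
    }
```

Each noise entry is individually indistinguishable from a real one on its face; the point is to raise the cost of using the database, not to survive deep verification.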
3.  Social steganography? Embed real discussions among a few friends within a crowd of chatbots, so it's hard to tell which comments are real, and hence which participants to trace or lean on.  [Or even rotate which apparent participant in a channel is the real person communicating, over time.]  Possibly not helpful if subversion happens at the human level, using the tapped-in comms device.
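One way the rotation in the bracketed aside could work, sketched under the assumption that the friends share a secret key out of band: per time epoch, everyone derives which of the n channel identities is the real sender, while an observer without the key sees only uniform chatter.

```python
import hmac
import hashlib

def real_sender_index(shared_key: bytes, epoch: int, n_identities: int) -> int:
    """Deterministically pick which channel identity is 'real' this epoch.

    Everyone holding shared_key computes the same index; to an outsider
    the choice looks uniformly random across identities.
    """
    digest = hmac.new(shared_key, str(epoch).encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:4], "big") % n_identities
```

The key derivation here is standard HMAC-SHA256; the epoch could be an hour number or message counter, which is an assumption about the channel's design.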
4.  Find ways to confound tracking and data-tracing.
 4a. Make mixing (or air-gap) services widely / anonymously available 
 4b.  Fake geo-tag generation: fake GPS data from a group of users' phones, so it can't be seen that they are all gathering together.  When turned on, emit randomized (but plausible) GPS coordinates on request. ++
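A minimal sketch of such a generator, assuming a flat-earth approximation (fine at city scale) and an illustrative 5 km scatter radius around a decoy anchor point:

```python
import math
import random

def plausible_location(anchor_lat, anchor_lon, max_km=5.0, rng=random):
    """Return a random point within max_km of a decoy anchor position.

    Each user's phone reports jittered coordinates around their own
    anchor, so co-located phones don't report co-located positions.
    """
    dist_km = rng.uniform(0, max_km)
    bearing = rng.uniform(0, 2 * math.pi)
    # ~111 km per degree of latitude; longitude degrees shrink with latitude.
    dlat = (dist_km * math.cos(bearing)) / 111.0
    dlon = (dist_km * math.sin(bearing)) / (111.0 * math.cos(math.radians(anchor_lat)))
    return anchor_lat + dlat, anchor_lon + dlon
```

A fancier version would make successive reports follow a believable walk rather than independent jumps, since independent jitter is itself a detectable signature.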
5.  Randomized salting of communication, to provide plausible deniability for those who pass on wrong information, and to spot-check members of a group for being a current leak.
Ex: Encrypted group chat is often pairwise-encrypted now, so there is no guarantee you received the same message as everyone else in the group.  You could implement round-robin disinformation, where one member of a group chat gets different info than the rest [and you could randomly select who gets the bad info, to see if outsiders sweep in or show up at the wrong place].
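The round-robin canary above can be sketched in a few lines; the function names are hypothetical, and delivery over the pairwise-encrypted channel is assumed to happen elsewhere:

```python
import random

def salt_round(members, real_msg, decoy_msg, rng=random):
    """Assign messages for one round: one random member gets the decoy.

    Returns the per-member assignment plus who the canary was, so the
    organizer can later check whether the decoy's contents leaked.
    """
    canary = rng.choice(members)
    assignments = {m: (decoy_msg if m == canary else real_msg) for m in members}
    return assignments, canary
```

Rotating the canary across rounds both spreads the deniability (anyone might have been given bad info) and, over time, narrows down a leak by which decoys surface.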
6.  Signalling: Be open about some of the above preparation, so that all parties know the returns on relying on such information are less certain.  Share how to build a system like this [specifics?] that anyone can adopt unilaterally, without active coordination.
7.  Open books: imagine ways to share access to your toolchain with friends; self-surveillance, to let everyone observe that there is no (or limited) collaboration with dangerous parties.
8.  Collective multi-national insurance? to offset risks of a bubble of tyranny in one place: a pool that will help you relocate, find jobs/home in another jurisdiction…  Similarly: flesh out details of potential future costs, currently handled by the public, that might become individual costs under f – in case you have to start paying for them yourself.
   8a.  Related: collective libersurance: investing in a libertarian solution that stops relying on government to provide those shared services (EPA protection, health insurance, &c), leaving less on the table for a governmental shift to distort.
   8b.  Counterpoint: you might be prevented from doing this, if the government is explicitly propping up one industry (coal) over another.  Government occupies a bunch of fields that individuals can't use.
   
9.  Reduce reliance on your region's infrastructure.  Practice living through blackouts; emphasize taking your gadgets off-grid on a regular basis, ensuring they still work.  Ditto for plumbing.
10.  Preserve multinational free-trade zones, black markets, and networks outside of national jurisdictions: not terribly large or strong, but with reasonable burst capacity and robust to crushing, so that there is always a functioning side channel.  [Ex: ?? falls in Lat Am, Kowloon City]

Related ideas

1. Fix security holes in current distributed communication.
  1a.  Metadata about who's using what network, and when, is still shareable.  WeChat is not very secure; even being in a channel can get you marked guilty and rounded up.  IPFS is great as far as it goes, but its routing mechanism still exposes the node-interconnection graph, which, as with BitTorrent, can show who seeds, shares, or acts as a hub.
  1b.  Iterated decentralization may be needed.  A mostly-decentralized system with central elements can be more vulnerable than a robustly-central system that acknowledges this as a weakness and prepares for it.
2. Consider multinational/extranational decision-making and stakeholding, so no core stakeholder group can be entirely dominated by a central national actor
3. Keep doing this work transparently and publicly.  Increase security for discussing & updating & suggesting new ideas. 


Designing life for episodic tyranny | 1: Secure toolchains
Friday November 11th 2016, 6:00 pm
Filed under: Aasw,Blogroll,chain-gang

See also Part 2: social networks

Motivation

Classify your local environment according to how much freedom you have to create and share tools, access those of others, and communicate across secure networks.  
  • In a “Tier 1” environment you have access to all popular security technology, and can build whatever infrastructure you want, entirely within your control.  
  • In a “Tier 2” environment, central network nodes and critical infrastructure all have backdoors and logging, and no one is allowed to distribute strong cryptography that some central group is unable to break.  
  • In a “Tier 3” environment, using secure tools and all but trivial cryptography is illegal – you shouldn’t have anything to hide.  Even talking about such tools may put you on a blacklist.  A central group that enforces the law may also access, modify, or reassign your work and possessions at will.
Say you live in a Tier 1 jurisdiction, which controls land, banks, and physical infrastructure.  Periodically, it shifts for a time to a Tier 3 regime, which may make abrupt changes at any depth in society to suit the fashion of the moment.
 
While in the latter regime, you can’t always trust the law or social norms to preserve
  • Your right to communicate with others
  • Your right to use your own tools and resources
  • The visibility (to you and those around you) of how your rights and tools are changing, if these are taken away

Most infrastructure in such an environment becomes untrustworthy.  Imagine losing trust in AT&T, Google, Symantec, Cisco.  (Even if you trust the people who remain running the system, they might no longer be in full control, or may not be able to inform you if your access was altered, filtered, compromised.)  

What can you do while in a Tier 1 regime to moderate the periods where you have fewer rights?

These are some quick thoughts on the topic, from a recent discussion.  Improvements and other ideas are most welcome.

Technical design decisions to improve resilience:

1.  multi-homing, letting users choose their jurisdiction.  for instance, let users choose from a number of wholly independent services running almost the same stack, each within a different jurisdiction.
  1a.  Be able to choose who hosts your data, tools, funds.  E.g., fix current US-EU policy – give users choice of where data resides and under which laws.
  1b.  Measure how long it takes to shift key storage / control elements between jurisdictions, copying rather than mirroring any required pieces.  Make it possible to shift on the timescale of the expected transition between Tiers.
 
2. Give users advance warning that the threat to their data/account is rising; make it possible to quickly change what is stored [not just what is shared with other users].
2a. Learn explicitly from how banking does this (cf. concerns among many users about funds being frozen, for less-than-fascist conflicts).
 
3. work with telcos to add built-in IP and egress-fuzzing
   3a.  consider what China does: blocking per IP, at each egress point.  Harder, but possible, in the US.
 
4. multi-source hardware, and any other needed ‘raw materials’ at each level of abstraction
  4a.  Both multiple sources w/in a jurisdiction (for the first stages when only some producers have lost control of their own production), and in different jurisdictions.
 
5. have systems that can’t be subverted too quickly: relying on the temporary nature of the fascist trend.  (if it lasts long enough, everything mentioned here can be undone; design to make that take a reasonable amount of time and a lot of humanpower)
  5a.  add meshes – like the electrical grid – that have local robustness.  When central management disappears or ‘shuts things off’, local communities can build a smaller-scale replica that uses the same physical infrastructure [even if they have to go in and replace control nodes, like generators, by hand]. 
  5b.  make change happen on the lifescale of hardware that has to be replaced.  e.g. a bulk of investment in dumb pipes that have to be replaced or removed by hand.  Systems with high upfront infrastructure costs that are easy to maintain but relatively hard to replace.
 
6. design alternate solutions for each level of the stack that have minimal central requirements.  E.g. fuel-powered USB chargers, gas generators, solar panels, desktop fabs and factories.  Make it easy to produce inferior, but usable, components if the high-economy-of-scale sources dry up.
 
7. keep strong contacts with someone in the existing [government], even when there’s nothing that you need to lobby for. that makes transitions smoother, and you less likely to be surprised by change.  Cf. Idea 3: invest heavily into those social relations.
 
8. distribute end-user tools that let individuals adapt under hostile conditions.  Examples:
  8a.  Ship antennas or power sources flexible enough to be modded.  
  8b.  Allow broadcast updates to the latest version, but allow users to freeze the version at one they support.  
  8c.  Support unblockable rollbacks to earlier revisions: something like a hardware button that rolls back to one of a few previous versions, if you realize you’ve installed malware or controlware.  You can still push updates as aggressively as you like, as long as the provider can hint that a new snapshot is useful as the risk of takeover increases.
  8d.  provide some sort of checksum to see if firmware has changed [even with the above, it may be possible for new software to change that option; but users should at least know]
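A toy sketch of 8c and 8d together, with hypothetical names: retain a few firmware snapshots, verify the live image against a known digest, and restore a previous snapshot on demand.  A real device would keep this logic in write-protected storage, which is assumed here rather than modeled.

```python
import hashlib

def digest(blob: bytes) -> str:
    """Checksum a firmware image (8d): SHA-256 of the raw bytes."""
    return hashlib.sha256(blob).hexdigest()

class FirmwareStore:
    """Keep the last few firmware images so rollback (8c) stays possible."""

    def __init__(self, initial: bytes, keep: int = 3):
        self.snapshots = [initial]  # oldest -> newest
        self.keep = keep
        self.live = initial

    def update(self, new_image: bytes):
        """Install a pushed update, retaining up to `keep` snapshots."""
        self.snapshots.append(new_image)
        self.snapshots = self.snapshots[-self.keep:]
        self.live = new_image

    def verify(self, expected_digest: str) -> bool:
        """Has the live image changed from what the user last approved?"""
        return digest(self.live) == expected_digest

    def rollback(self, steps_back: int = 1):
        """The 'hardware button': restore one of the retained versions."""
        self.live = self.snapshots[-(steps_back + 1)]
        return self.live
```

As 8d notes, software-only checks like this can themselves be subverted by a hostile update; the sketch only shows the bookkeeping, not the tamper-resistance.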

Related ideas

1. consider reasonable steps to degrade control:  
  1a.  starting with increased infra for those who align with government views.  (or decreased for those breaking new / stringent laws)
  1b.  compare how voting is restricted, liquidity is restricted.
 
2. consider: is it better to be asset-heavy or asset-light?  
  2a.  usefulness of land and resources to use, vs. having things that can’t be claimed / revoked. networks rather than assets – land, tools?  
  2b.  compare liquidity of favors to that of funds or items.
 
3. compare current work with regulations/regulators.  In politics, relationships within a commission made it valuable to have a revolving door.  Invest in those relations, considering also 2) above: invest before assets are frozen, to offset risk.
 
4. compare how US corps plan for inter-state shifts within the country.  Including being flexible enough to move to a new state for favorable regs, or shift ops/people among different centers.
5. Currently there’s network tracking of IP and MAC addresses in malls, &c.  There are tools now with a ‘war mode’ that randomizes your MAC or other addresses continuously.  Injecting noise into Bluetooth and other tracking is straightforward.
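The MAC-randomization part is genuinely simple.  A sketch, using the standard IEEE 802 convention that bit 1 of the first octet marks a locally administered (i.e. made-up) address and bit 0 must stay clear for unicast:

```python
import random

def random_mac(rng=random):
    """Generate a fresh locally-administered, unicast MAC address.

    Emitting a new one per probe means a tracker can't link sightings
    of the same device across time or locations.
    """
    # Clear the multicast bit (bit 0), set the locally-administered bit (bit 1).
    first = (rng.randint(0, 255) & 0b11111100) | 0b10
    octets = [first] + [rng.randint(0, 255) for _ in range(5)]
    return ":".join(f"{o:02x}" for o in octets)
```

Modern mobile OSes already do per-network MAC randomization by default; the ‘war mode’ tools mentioned above just do it more aggressively, per probe rather than per network.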


