Freedom of choice

Arseni Mourzenko
Founder and lead developer
February 17, 2015
Tags: thoughts

Scott Adams recently wrote an article called “Would You Take Orders From Machines?”. Although the idea is interesting, my humble opinion is that Scott is missing the point.


The first problem with his article is the use of the term “orders”. We don't take orders from machines, but merely use the information they provide to us. An order is an authoritative command, direction, or instruction; thus, the notion involves subordination and obedience, which translates into unconditional following of the order. When a soldier is given an order but doesn't follow it, the soldier is disobedient and his role within the military system is compromised. When I give an order to my PC to copy a file, I expect it to do it. If, instead, it copies the file to a different location or simply removes the file, that would indicate a problem within the machine, which may lead to its replacement.

Let's talk about Scott's GPS example. Drivers don't take orders from their GPS. They simply use the information the GPS provides to choose a route. They may, on the other hand, take into account information from other sources and take a route which differs from the one the GPS suggests. For instance, they may notice that a suggested road looks congested and take a detour. Or they may know that a new road opened yesterday and take the shortcut. Or the driver's husband may call and ask them to buy some groceries while on the road. Or they may notice a nice park and decide to walk through it to the destination instead of driving.

With this basic example, it's easy to notice two substantial differences: the possibility of having multiple sources of information (1) and the absence of any notion of obedience (2).

  1. Orders usually come from a single source. Imagine a soldier receiving an order to defend a spot from his commander, then an order to retreat from another commander, and at the same time an order to attack from a third commander. Or imagine my PC being controlled by me and ten other people, all giving contradictory orders. This won't work for long.

  2. What happens when I decide to walk through the park instead of driving? What happens if I turn right in order to buy some groceries, while my GPS told me that I should turn left?

    Not much. I won't be punished by my GPS. It will not consider me disobedient, and it will not replace me with another driver. I am not a faulty element of a system when my decisions contradict the suggestions of my GPS.

    It would be a different story if a soldier started deciding which orders to follow, or my PC started deciding which files should be copied.

An excellent illustration is the computerization of aircraft. Planes are mostly flown by machines, which might make it look like pilots have no actual choice. But look at what happens in an emergency. In TransAsia Flight 235, would the machine have overridden the pilots' decision to shut down an engine? Would it have prevented them from doing it? What happens is that the machine steps back and lets skillful pilots make the decisions which may make the difference between hundreds of deaths and a safe landing.

But wait, you may ask: what about Scott's example of stoplights? The illustration belongs to a different category, but the implications are the same. Stoplights don't give orders. They merely provide us with information. The fact that we don't run a red light is explained by our social behavior, by the fact that we accept to follow the law and do what our morals tell us to do. The machine aspect of the stoplight is irrelevant here; it is simply an avatar. What is relevant is the law and our morals. I won't murder my neighbor (even if he makes a lot of noise when I try to concentrate on my article), because it's against the law and against my morals. Similarly, I stop at a red light, guided once again by the law and my morals.

If the notion of an avatar is unclear, let me illustrate it with the example of a soldier. When a soldier receives an order through his radio, is he obeying the radio or his commander? Right, the radio is only a tool. Now imagine that instead of talking through the radio, the commander can remotely activate one of two lights on a small device the soldier carries. The green light means “Fire at will”, the blue light means “Stay low and move to the extraction zone”. Is the soldier following the orders of a small device, or those of his commander?

While stoplights are not activated manually by a policeman, the idea is the same. We, the people, made those stoplights; we defined the rules which make them turn red or green. A stoplight doesn't “decide” that your car should stop. A stoplight just powers different lamps according to some basic rules. It doesn't understand those rules, and it doesn't understand the meaning of the red or the green light, nor its consequences on the world.
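Those “basic rules” can be sketched as a trivial state machine. This is a hypothetical illustration, not how any real traffic controller works: the point is that the device merely steps through human-defined states, with no grasp of what the colors mean to drivers.

```python
# Hypothetical stoplight sketch: a fixed cycle of human-defined states.
# The machine only advances through the list; the meaning of "red" and
# "green" exists entirely in the laws and morals of the people who use it.
STATES = ["red", "green", "yellow"]

def next_state(current: str) -> str:
    """Return the state that follows `current` in the fixed cycle."""
    return STATES[(STATES.index(current) + 1) % len(STATES)]

print(next_state("red"))     # green
print(next_state("yellow"))  # red
```

Nothing in this loop resembles a decision about your car; the “decision” was made once, by whoever wrote the rules.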

Freedom of choice

The second problem is the notion of choice and freedom of choice. According to Scott, “My too-clever point is that someday humans will be enslaved by their machines without realizing it.”

What is freedom of choice? Do we actually have freedom of choice?

Without getting too philosophical, it is clear that nearly every choice we make is somehow constrained by our environment.

When I buy groceries, am I completely free in my choice? Not really: for starters, I can buy only what is available in the store. I can choose whether I want oranges or apples, unless there are no oranges left, or the owner of the shop decides to double the price of apples.

When I go walking, am I free to go wherever I want? Not really: there is private property, there are districts of the city which are not safe, and there are plenty of other factors which influence my decision.

It still looks like I have freedom of choice. I can buy oranges even if every grocery store in my city decides to sell only apples: I can still drive to another city. And I can walk in any district of the city, at my own risk. But the fact remains that the consequences of our choices make us free only in appearance.

Machines don't take freedom of choice away from us. Their presence is simply irrelevant to it. They become another factor which may influence my choices, but this doesn't mean that I'm more constrained than I am without them. The nice part is that since they don't give orders, but merely suggest and provide information, I always have the choice to consider their suggestions relevant or to ignore them.

Imagine a basic case. I'm about to go to a restaurant with my friends. Am I free to choose one in the first place? Probably not, because some are closed, some are too far away, and some don't serve vegetarian food (and one of my friends is vegetarian). Moreover, another friend may hate my favorite restaurant, so my choice can be overruled because those fucking social rules force me to consider the opinions of my friends for I don't know what reason!

Now, I can ask my smartphone for suggestions. Let's see... hm, there is a restaurant I don't know which is not far away, which serves vegetarian food, which can take my reservation online right now, and which has a good rating. Let's go there!
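What the smartphone does here can be sketched in a few lines (the names, fields, and thresholds below are hypothetical, purely for illustration): it filters a list by the constraints I already had and ranks what's left. It narrows the menu of options; it issues no order.

```python
# Hypothetical restaurant-suggestion sketch: filter by my own constraints
# (open now, close enough, vegetarian options), then rank by rating.
# The "decision" the phone produces is just sorted information.
restaurants = [
    {"name": "Chez A", "open": True,  "distance_km": 1.2, "vegetarian": True,  "rating": 4.6},
    {"name": "Chez B", "open": False, "distance_km": 0.5, "vegetarian": True,  "rating": 4.8},
    {"name": "Chez C", "open": True,  "distance_km": 9.0, "vegetarian": True,  "rating": 4.9},
    {"name": "Chez D", "open": True,  "distance_km": 0.8, "vegetarian": False, "rating": 4.7},
]

def suggest(places, max_km=3.0):
    """Keep places that satisfy my constraints, best-rated first."""
    ok = [p for p in places
          if p["open"] and p["distance_km"] <= max_km and p["vegetarian"]]
    return sorted(ok, key=lambda p: p["rating"], reverse=True)

print([p["name"] for p in suggest(restaurants)])  # ['Chez A']
```

I wrote the constraints; the phone only applied them. I remain free to ignore the list entirely and walk through the park instead.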

Who made the decision now?

This means that the fact that omnipresent machines influence our choices has no real implications. Yes, we use those machines to make decisions, but this doesn't make us slaves, nor do we actually take orders from machines. We are free, just as we were free before we bought a GPS, a smartphone, and dozens of other gadgets intended to make our lives easier by enslaving us.