Acting Attorney General Matthew Whitaker is consulting with ethics officials regarding possible recusal from overseeing the special counsel’s Russia investigation, the Justice Department said on Monday.
“Acting Attorney General Matt Whitaker is fully committed to following all appropriate processes and procedures at the Department of Justice, including consulting with senior ethics officials on his oversight responsibilities and matters that may warrant recusal,” Kerri Kupec, a department spokeswoman, said in a statement on Monday night.
Trump has been boiling for months about Sessions’ recusal from the Russia investigation, and now I would guess there is a better-than-average chance Whitaker is going to recuse as well.
Their cunning plan to make this all go away is continuing to blow up in their faces, and it will keep on blowing up in their faces because Trump and his circle of goons and lickspittles are a bunch of cheap grifters.
According to the Wall Street Journal, some Google services were “temporarily unreachable for some users after some traffic intended to reach the web giant was rerouted through other networks,” though the company has not publicly disclosed whether it has determined the issue was a technical error or a hacking attempt. The AP reported, however, that the re-routing may have been the result of a Border Gateway Protocol (BGP) hijacking attack—in which false routing announcements cause the networks that carry global internet traffic to send it to the wrong destinations.
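To see why a bogus announcement can win, it helps to know that routers prefer the most specific (longest) matching prefix. Here is a toy sketch of that selection rule—this is not real BGP, and the hijacker’s AS number is invented (the legitimate prefix and AS15169 are Google’s, but the scenario is purely illustrative):

```python
import ipaddress

# Toy routing table: each announced prefix maps to the network (AS)
# claiming to originate it. Illustrative only -- not a real BGP feed.
routes = {
    ipaddress.ip_network("8.8.8.0/24"): "AS15169 (legitimate origin)",
}

def best_route(dest, table):
    """Pick the most specific (longest-prefix) matching announcement."""
    addr = ipaddress.ip_address(dest)
    matches = [prefix for prefix in table if addr in prefix]
    if not matches:
        return None
    return table[max(matches, key=lambda p: p.prefixlen)]

print(best_route("8.8.8.8", routes))   # the legitimate origin wins

# A hijacker announces a *more specific* prefix covering the same address...
routes[ipaddress.ip_network("8.8.8.0/25")] = "AS666 (hypothetical hijacker)"

# ...and longest-prefix matching now prefers the bogus announcement.
print(best_route("8.8.8.8", routes))
```

Because the bogus /25 is more specific than the legitimate /24, every router that accepts the announcement starts forwarding that traffic to the hijacker—no “hub” has to be compromised at all.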
On a side-note, I have been listening to David E. Sanger’s The Perfect Weapon in the car, and it is quite good, albeit rather sobering.
This weekend my hotel room had this very weird little booth space right against the TV that turned out to be perfect for writing.
I rarely work in coffee shops, but I love sitting in diner booths — there’s something about being tucked in on three sides that makes me really want to just sit and work.
I’m not the only one.
I can’t say I have ever spent much time actually working from a diner booth, but now I want to give it a try.
The biggest and most frightening impact of the AI revolution might be on the relative efficiency of democracies and dictatorships. Historically, autocracies have faced crippling handicaps in regard to innovation and economic growth. In the late 20th century, democracies usually outperformed dictatorships, because they were far better at processing information. We tend to think about the conflict between democracy and dictatorship as a conflict between two different ethical systems, but it is actually a conflict between two different data-processing systems. Democracy distributes the power to process information and make decisions among many people and institutions, whereas dictatorship concentrates information and power in one place. Given 20th-century technology, it was inefficient to concentrate too much information and power in one place. Nobody had the ability to process all available information fast enough and make the right decisions. This is one reason the Soviet Union made far worse decisions than the United States, and why the Soviet economy lagged far behind the American economy.
However, artificial intelligence may soon swing the pendulum in the opposite direction. AI makes it possible to process enormous amounts of information centrally. In fact, it might make centralized systems far more efficient than diffuse systems, because machine learning works better when the machine has more information to analyze. If you disregard all privacy concerns and concentrate all the information relating to a billion people in one database, you’ll wind up with much better algorithms than if you respect individual privacy and have in your database only partial information on a million people. An authoritarian government that orders all its citizens to have their DNA sequenced and to share their medical data with some central authority would gain an immense advantage in genetics and medical research over societies in which medical data are strictly private. The main handicap of authoritarian regimes in the 20th century—the desire to concentrate all information and power in one place—may become their decisive advantage in the 21st century.
If we have learned nothing else over the last two years (and I hope we have!), it is that systems inherit the tendencies and biases of the organizations that design and build them. Algorithms are not neutral.
I think Harari is right to be wary of AI and of technology in general. However, I think this particular worry is overstated.
Speaking more broadly, it is not technological dystopia I fear, so much as technological _crap_topia. I don’t worry too much about an authoritarian dictatorship using AI-powered tech to monitor and manage every aspect of our lives. I think we are more likely to suffer an ever-progressing trend of shitty marketing—the sci-fi future equivalent of a search-query typo that causes the same stupid banner ad for a product you’re not interested in to follow you around for weeks at a time.
My hospital had, over the years, computerized many records and processes, but the new system would give us one platform for doing almost everything health professionals needed—recording and communicating our medical observations, sending prescriptions to a patient’s pharmacy, ordering tests and scans, viewing results, scheduling surgery, sending insurance bills. With Epic, paper lab-order slips, vital-signs charts, and hospital-ward records would disappear. We’d be greener, faster, better.
But three years later I’ve come to feel that a system that promised to increase my mastery over my work has, instead, increased my work’s mastery over me. I’m not the only one. A 2016 study found that physicians spent about two hours doing computer work for every hour spent face to face with a patient—whatever the brand of medical software. In the examination room, physicians devoted half of their patient time facing the screen to do electronic tasks. And these tasks were spilling over after hours. The University of Wisconsin found that the average workday for its family physicians had grown to eleven and a half hours. The result has been epidemic levels of burnout among clinicians. Forty per cent screen positive for depression, and seven per cent report suicidal thinking—almost double the rate of the general working population.
Something’s gone terribly wrong. Doctors are among the most technology-avid people in society; computerization has simplified tasks in many industries. Yet somehow we’ve reached a point where people in the medical profession actively, viscerally, volubly hate their computers.
Later in the piece, the author talks about how, while the software makes healthcare providers’ jobs more difficult, it has yielded better results for patients.
It is hard to argue with that outcome, but I also wonder if this isn’t another case where, in optimizing for one part of a system, we have made the whole thing unsustainable over time.