CASE Act – The Hill Internet Regulation Dies On

Privacy has been obliterated, Russians hack our elections, Facebook buys information about women’s periods, Internet Service Providers (ISPs) want to charge more to access certain websites, culture industries want to restrict access to content, the Internet has become a giant mall, free speech has been limited, and algorithms feed us the worst of humanity. What does Congress want to do to regulate the Internet? Make the Internet more punitive for people who share cultural content.

Today, the US House of Representatives passed The Copyright Alternative in Small-Claims Enforcement (CASE) Act 410-6. Whenever something passes Congress nearly unanimously with little debate, it must be about copyright. As I discussed in iTake-Over, Congress allows copyright “stakeholders” to negotiate copyright law and then hurriedly passes it. The CASE Act is the quintessential example of this process described by Jessica Litman.

Witness the language of the bill’s sponsor, Rep. Hakeem Jeffries (D-NY): “The internet has provided many benefits to society. It is a wonderful thing, but it cannot be allowed to function as if it is the Wild West with absolutely no rules. We have seen that there are bad actors throughout society and the world who take advantage of the internet as a platform in a variety of ways. We cannot allow it.”

Yes! We cannot allow internet companies to operate in a Wild West of unregulated capitalism. I couldn’t agree more! But the solution? Create a small claims tribunal that ignores due process and will allow cultural industries to issue take-down notices. This legislation will do nothing to regulate the Wild West and everything to shut down free speech further. The CASE Act will be the hill that internet regulation dies on.

Fast Capitalism 16.2

I am the editor of Fast Capitalism, a peer-reviewed journal that publishes “essays about the impact of rapid information and communication technologies on self, society and culture in the 21st century.” Our latest issue, 16.2, includes three essays that closely align with The Dialectic of Digital Culture, including co-editor Jennifer Miller‘s essay “Bound to Capitalism: The Pursuit of Profit and Pleasure in Digital Pornography.”


Jennifer Miller

This paper explores digital pornography dialectically through a case study of an online BDSM subscription service. It considers the tension between discourses of authenticity that seek to obscure profit motive and shifts in content prompted by market considerations.

Mario Khreiche

The future of work has come under renewed scrutiny amidst growing concerns about automation threatening widespread joblessness and precarity. While some researchers rush to declare new machine ages and industrial revolutions, others proceed with business as usual, suggesting that specialized job training and prudent reform will sufficiently equip workers for future employment. Among the points of contention are the scope and rate at which human labor will be replaced by machines. Inflated predictions in this regard not only entice certified technologists and neoclassical economists, but also increasingly sway leftist commentators who echo the experts’ cases for ramping up the proliferation of network technologies and accelerating the rate of automation in anticipation of a postcapitalist society. In this essay, however, I caution that under the current cultural dictate of relentless self-optimization, ubiquitous economic imperatives to liquidate personal assets, and nearly unbridled corporate ownership of key infrastructures in communication, mobility, and, importantly, labor itself, an unchecked project of automation is both ill-conceived and ill-fated. Instead, the task at hand is to provide a more detailed account of the nexus of work, automation, and futurizing, to formulate a challenge to the dominance of techno-utopian narratives, and to intervene in programs that too readily endorse the premises and promises of fully automated futures.

Sascha Engel

This paper discusses an underrepresented dimension of contemporary alienation: that of machines, and particularly of computing machines. The term ‘machine’ is understood here in the broadest sense, spanning anything from agricultural harvesters to cars and planes. Likewise, ‘computing machine’ is understood broadly, from homeostatic machines, such as thermostats, to algorithmic universal machines, such as smartphones. I suggest that a form of alienation manifests in the functionalist use and description of machines in general; that is, in descriptions of machines as mere tools or testaments to human ingenuity. Such descriptions ignore the real and often capricious existence of machines as everyday material entities. To restore this dimension, I first suggest an analytics of alienating machines – machines contributing to human alienation – and then an analytics of alienated machines – machinic alienation in its own right. From the latter, I derive some possible approaches for reducing machinic alienation. I conclude with some thoughts on its benefits in the context of so-called ‘Artificial Intelligence’.

Amazon: eyes and ears everywhere . . . telling the police

As if Amazon needed more promotion, CNET, the Washington Post, The Verge, Popular Mechanics, Business Insider, NBC, and many more covered the event as if it were actual news (don’t forget Facebook is doing the same). These tech company events have become part of our culture, but the reporting ignores the deep implications of these technologies. In fact, Amazon outdid itself this time around with its insistence on invading our privacy.

Most importantly, Amazon created a pair of glasses with Alexa embedded in them. Because things went so brilliantly with Google Glass, Amazon decided, hey, let me get in on that resistance to new technologies. With its “Echo Frames,” Amazon will be able to record everything that users see. That includes all of the people out there who do not wear Echo Frames and never consented to being recorded, and there’s nothing we can do about it – except declare they’re not allowed in certain places.

What does Amazon want to do with this? Sell things to you at every turn. Your world with Echo Frames will be a walking advertisement. You see something, and an alert pops up to buy it. Alexa will announce it to you through the new Echo Buds, or talk to you through a ring called the Echo Loop. You’ll be tapped into all the ads you could ever dream of . . .

But that’s not all. Don’t forget that Amazon owns the Ring Doorbell. Ring and the Neighbors App have deals with police departments across the United States to “share” information from Ring on request. So they will sell this data to the police, too. We also know that Amazon has given Echo data to police in certain circumstances.

My guess is they’ll also force their workers to wear Echo Frames to monitor them on the job.

We need to be wary of these new technologies, ask the tough questions about why tech companies want to sell them, and think through the implications. As I mentioned in a previous post, newspapers (especially the Washington Post) raise these concerns only in the most banal ways.

Washington Post questions Amazon in the most banal way

For the first time, an article in The Washington Post becomes reflective about an Amazon plan. The article asks: “Amazon starts crowdsourcing Alexa responses from the public. What could possibly go wrong?” The newspaper owned by Jeff Bezos, CEO and founder of Amazon, decides to ask what could go wrong about the most banal of Amazon plans.

Amazon now allows users to update Alexa’s responses. You can contribute, and through big data, Amazon will pick the most frequent answers. This is really no different from relying on search results. But the Washington Post finally wonders, what could go wrong?

The answer is relatively harmless compared to other things that Amazon does that the newspaper celebrates. Amazon wants to deliver using drones, Alexa listens to everything you do, Ring Doorbell partners with police departments, Amazon automates its distribution centers, Alexa adds cameras–what could go wrong? A lot.

We see Amazon continually encroaching on our privacy – both for those who consent and those who do not – exploiting workers, downsizing the workforce, etc. On these issues, the Washington Post is silent. When the national paper of record decides to question the societal impact of an Amazon decision, it is about the most trivial of the problematic things that Amazon does.

Facebook, Dating Apps, and Period Trackers

When I learned about Facebook’s decision to launch a dating service, I was not shocked. After all, Facebook began as a quasi-dating service; Zuckerberg started the whole thing with the “relationship status” field. However, by making an explicit dating service, Facebook makes one thing clear: it wants all your information. If you use the dating service, not only does Facebook have access to all of your typical data, but it can add what you like and dislike in other people.

But today, something crazier came out. “UK-based advocacy group Privacy International, sharing its findings exclusively with BuzzFeed News, discovered period-tracking apps including MIA Fem and Maya sent women’s use of contraception, the timings of their monthly periods, symptoms like swelling and cramps, and more, directly to Facebook.” That’s right: Facebook bought (they always use the term “share”) information about women’s sex lives and periods.

The amount of information that Facebook retains on its users is obscene. And Facebook’s use is not regulated in any way.

Tracking Privacy’s Loss

When I started driving, my mom would make me take her cellphone with me when I left on my own. I lived in the country and drove back roads, and my mom was scared that my car would break down and I would be stranded for hours. Sure enough, my car did break down, and I was able to call for help. However, she otherwise forbade me from making phone calls (so few minutes to spare), and phones were banned from schools. The fear was that something bad could happen and a cellphone would save me. Today, not only do new drivers in the country have cellphones, but little kids have GPS trackers that announce their every move. Under Armour even makes kids’ shoes with GPS so you can track where they are and make sure they’re not being couch potatoes.

There is an ugly slippery slope here. First, kids take a cellphone just in case something happens. Second, parents say call me when you get to X; if a parent doesn’t receive the call, full freak-out mode. Third, the calls turned to texts. Fourth, some parents realized they could use the phone-tracking features on iPhones (originally for lost/stolen phones) to see where their kids were at any given moment. Fifth, apps developed specifically for the purpose of tracking children. Next, these apps became ever-more invasive, sending all texts, calls, emails, web searches, and pictures to parents. Predictably, kids found ways around this. They turned to Snapchat to send fleeting messages that disappeared from view. They turned off their phones to stop the GPS tracking. They bought burner phones! Finally, parents turned to GPS trackers.

Recently, a study came out showing that this tracking information was freely available online. Part of the problem was the company’s default password. But the password is only part of the problem—the other part is our willingness to give up privacy for a perceived good. This is privacy lost voluntarily. When we track kids, everyone can track kids; this should not surprise us.

My question is, when does tracking kids stop? You begin tracking your kid for whatever fear you may have as a parent. But when do you stop? You might tell yourself, I’ll stop monitoring texts when she’s 16 and location at 18, but will you? There are certainly legal questions once someone reaches 18, but parents have ways of exercising control over their kids.

Furthermore, constant tracking changes the tracked. Kids grow up without an expectation of privacy. If their parents can see everything, they change their behaviors and imagine constant surveillance. Then, when companies or the state surveil them, they are unsurprised. Why would things be any different? This has massive implications for privacy as a public ideal in the future. In their chapter “Government vs. Corporate Surveillance: Privacy Concerns in the Digital World,” Brian Connor and Long Doan wrestle with the distinction between corporate and government surveillance. But this seems to be a new type of surveillance that we should watch: familial surveillance.

In a conversation with students, I learned of something even creepier (to me). People now track their significant others! The apps developed to track kids are now used by people to track their boy/girlfriend/partner/spouse. The students were incredulous: why wouldn’t you want to know where your significant other is? If you can’t track them, you can’t trust them because they MUST be up to no good.

These are systemic issues that we need to explore on the public policy level. It’s not a question of whether or not you opt-in, but rather that opting-out is no longer an option.

Google’s Employees Can No Longer Identify Evil

Google Code of Conduct

Google’s first motto was “Don’t Be Evil.” This seemed like a fitting motto for any tech company that aims to be a force for good (and who doesn’t want to be a force for good?). Of course, Google removed the motto from its website in 2018, after years of criticism for doing evil things. Did this mean they no longer held it to be their motto? Yes, and in 2019 it is quite clear that the opposite of the motto now holds.

A recent story reports that Google will no longer allow employees to discuss anything but work. Specifically, Google is afraid of being attacked by right-wing ideologues for seemingly “liberal” positions. One of those “liberal” positions, I assume, is not being evil. As a result, workers can only talk about “facts.” But what is a fact? What types of facts can workers discuss? Who determines which facts are factual? This is especially important at a company highly criticized for enabling a post-truth society. How does Google address moderating factual information?

I write this in the context of their motto “don’t be evil.” While it is no longer their motto, it is still part of their code of conduct (though an afterthought).

Determining whether something is evil requires discussion. People need to be able to speak openly and honestly to ensure that they are doing the right thing. In fact, Google’s code of conduct tells employees “don’t be afraid to ask questions of your manager, Legal or Ethics & Compliance.” However, the new rules seem to undo the Code of Conduct.

Employees at Google must be able to discuss more than “facts” in order to make the Internet a better place, especially given that Google holds one of the largest monopolies of information on the web. Now is the time to regulate Google.

Won’t you be my (digital) neighbor?

A white 30-something male walks through a park at night and reaches into his pocket to pull out his cellphone. An Asian woman walks down a street at night and takes her smartphone from her purse. A white woman stops in her kitchen to attend to an app on her smartphone. We get the first glimpse of what catches their attention on their phones: a long-haired white guy wearing a hoodie is at someone’s front door. Another white male wakes up in his bed, wearing pajamas, to view the same video. We see that the Asian woman from earlier sees the same video. We learn for the first time that there is text that goes along with the video:

This guy tried to break into my house!

This guy came around the side of my house trying to break in! He ran when I set off my alarm but he may still be in the area

Neighbor 13: I’ve seen this guy before

Neighbor 2: Saw him at my house earlier

Finally, we see a suburban neighborhood with large beautiful homes and pools from the sky. Several dozen homes have a blue halo around them signifying that they use this product. The commercial is for the new app for the Ring doorbell camera. The app, called “Neighbors,” bills itself as the “new neighborhood watch.” The spokesperson says, “At Ring, we want to keep neighborhoods safer, by keeping you informed.”[1]

This advertisement reminds me of my first week in my home in a nice, safe neighborhood. My wife and I were walking our 6-month-old son and dog around the neighborhood. An eccentric neighbor began speaking with us and informed us that she and her husband had recently seen a car that did not belong in the neighborhood, and that we needed to watch our vehicles because this suspect was probably breaking into cars. While we had no further concrete information about the supposed perpetrator, we had an immediate uneasiness about our new neighborhood (over time it has become an uneasiness about that particular neighbor). Maybe the idea behind this app is that I can have further peace of mind that we can come together to chase away potential car thieves, or maybe we turn every person whose appearance we do not like into a potential thief. The point of The Dialectic of Digital Culture was to emphasize that the intention of the developers of these technologies is often the latter. In fact, police agencies developed Neighborhood Watch in the 1970s based upon a perceived increase of threats in neighborhoods. However, the perceived threat during the 1970s was always-already racialized, and not based upon actual increases in crime.[2]

Digital technology infiltrated neighborhoods with community boards long ago. My neighborhood uses the popular “Nextdoor” website. If you feel paranoid, or voyeuristic, you can log on to this website and see all kinds of moral panics around perceived folk devils in your community.[3] In one post in my neighborhood, a woman taking a dirty, beat-up, discarded rug becomes a thief shamed and immortalized on the Internet. We can have disconnected conversations with neighbors we do not know about events we do not understand. But we can all pull together and recognize one thing: we are not safe. Neighbors allows us to use digital technology not only to alert our neighbors of potential threats, but to raise everything to a potential threat. Already, people are developing lists of best practices for your doorbell camera, and police departments have started programs to tap into these cameras. Whereas Ring professes it designed the Neighbors App to bring together people in a neighborhood, it will recoil us further into our own little webs as we become scared to show up on a neighbor’s door unannounced for fear of being painted as an intruder or mocked for what we wear.

Furthermore, Ring (and Amazon) partnered with police departments to use 911 data to alert people on the “Neighbors” app when their neighbors contact emergency services. You may be busy changing a diaper, but you can receive an update that your crazy neighbor called 911 because someone let their grass grow too high. Or someone sees a “suspicious” vehicle and alerts the police. Instead of just hearing about your voyeuristic neighbors on the app, you can now receive every report of any type of incident straight to your phone.

But Ring goes even further in its relationship with the police. Amazon also provides over 225 police departments with access to video footage from Ring (EDIT: The Washington Post is now reporting over 400!). That means that police departments can see what is going on at your front door. Some police even tout it as a service to the community. You think you have a zone of privacy at your home because you don’t have a Ring doorbell? Only if your neighbors don’t have one either. If someone else’s doorbell faces your home, police can see your home too! Even worse, they can see in your windows if the camera is pointed at them.

While the commercial for the Neighbors App demonstrates one creepy level of the digital dialectic, Ring (and its owner Amazon) tries to outdo itself through targeted marketing on social media. The commercial discussed above had thinly veiled racial anxieties behind it, but Ring goes further. On the morning of February 8, 2019, I witnessed the full extent of the Neighbors App’s pushing of racial anxieties when a Ring-sponsored post from September 28, 2018 appeared in my Instagram feed. In this ad, four African American men are displayed through the Neighbors App taking packages and breaking into homes in “Culvert City.” The ad ends with “Wonder what’s going on in your neighborhood? Download the free Neighbors App.” Since the app is free, you do not even need the Ring doorbell to participate in your neighborhood’s surveillance. However, a second level of racialized myth emerges from this Instagram advertisement: the surveillance targets people through their computer searches. Since I did research on my various devices about Ring and the Neighbors App, Ring targeted me as a potential customer, because companies track my every move on the Internet across platforms. This is especially problematic because of the ubiquity of Amazon; the different platforms that I log into with Amazon track me in the background using cookies. For instance, I read The Washington Post (owned by Jeff Bezos, Amazon’s founder and CEO) on my tablet each morning, logging in with my Amazon account; often, articles about surveillance draw my attention for my research. Amazon uses this information to target me further. While the first ad featured racist conceptions of vulnerable populations, the target of the ad was obscured, but objectified through white privilege. In the second ad, however, Ring recognized me as a white guy and fed into hegemonic white fears of black bodies. These individualized ads target us based on demographic information. Ring and Amazon proudly stoke racist anxieties for profit.

If you’ve ever been around someone with a Ring doorbell (or own one yourself), you know they create a minor annoyance: every time a bee flies by the camera, it sends an alert that there is someone at the door. While annoying, this is the least of the problems. Amazon and Ring are developing a surveillance state that knows everything about you and targets your fears and vulnerabilities in order to increase surveillance. We embed these technologies in our everyday lives, but their pervasiveness needs to be analyzed, not accepted.

[1] Neighbors App: The New Neighborhood Watch:

[2] Hall et al., Policing the Crisis: Mugging, the State, and Law and Order.

[3] Cohen, Folk Devils and Moral Panics: The Creation of the Mods and Rockers.

The Dialectic of Digital Culture Published

While The Dialectic of Digital Culture was supposed to be published in mid-September, the team at Lexington Books is incredibly efficient and released the book early!

Jennifer Miller and I spent the weekend in New York City attending the American Sociological Association Annual Conference. When we arrived, our fabulous editor Courtney Morales had her copy in print. Then we started receiving emails and texts from contributors that the book had in fact arrived. Of course, since we were away from home, we didn’t receive our copies until Wednesday.

The book would not have been possible without the support of the University of Texas at Arlington’s College of Liberal Arts, Department of Sociology and Anthropology, Department of English, and the Center for Theory. We’re also grateful for all of our wonderful contributors.

The books look great and I couldn’t be happier with the result. Thank you to Courtney, Shelby Russell, and the rest of the team at Lexington Books. Order your copy now.

Here are some pictures from the American Sociological Association conference.


David Arditi holding The Dialectic of Digital Culture for the first time.



Jennifer Miller

Jennifer Miller presenting her Chapter from The Dialectic of Digital Culture



Brian Connor presenting his Chapter from The Dialectic of Digital Culture at ASA.

Dialectic of Digital Culture at the American Sociological Association Conference

Several of our contributors will be presenting their research at the American Sociological Association Annual Conference in New York City. Both editors will be at the conference presenting their research as well.

On the panel Critical Theory I: Dialectical Engagements, Jennifer Miller will present her chapter from The Dialectic of Digital Culture, entitled “Queer Ends: Digital Culture, Queer Youth, and Heterosexuality Beyond Heteronormativity.” Also on that panel, contributors Brian Connor and Long Doan will present “Government vs. Corporate Surveillance: Privacy Concerns in the Digital World.” We hope that you join them to get a glimpse of the work in the book.

In a similar vein to The Dialectic of Digital Culture, I will be the discussant for Digital and Social Media: Perceptions, Uses and Impact. The papers on this panel present a unique opportunity to think about the digital dialectic in more areas. I’m also presenting my music industry research on the Open Topic on Marxist Sociology panel. My paper, “Copyright as Enclosure: State, Capital, and Primitive Accumulation,” explores the way copyright is used to exploit musicians.

Contributor Nancy Weiss Hanrahan will be at the ASA as well serving as a discussant for “Positionality and the Construction of Feminist Theory.”