Monday, May 21, 2018

Ethical Journalism in an Age of Mass Murder

For a long time, there has been strong (overwhelming?) evidence that the media influences the number of people who die by suicide.  Called the "copycat effect" or "media contagion," it's basically the idea that when the media reports on suicide, it influences more people to kill themselves.

"Research into suicide coverage worldwide by journalism ethics charity MediaWise found clear evidence that the attention given to the circumstances surrounding a celebrities who kill themselves is more likely to incite copy cat suicides."

For this reason, the media has best practices for suicide reporting: don't cover suicides at all unless the person is noteworthy, don't glamorize or romanticize the act, etc.  This dedication to language best practices is fairly sophisticated - for example, the Associated Press even recently recommended against using the phrase "committed suicide."

---

Three years ago, Malcolm Gladwell published an article that posited a similarly intuitive (even obvious) theory on mass shootings.  I'll just quote his main point here:

"But Granovetter thought it was a mistake to focus on the decision-making processes of each rioter in isolation. In his view, a riot was not a collection of individuals, each of whom arrived independently at the decision to break windows. A riot was a social process, in which people did things in reaction to and in combination with those around them. Social processes are driven by our thresholds—which he defined as the number of people who need to be doing some activity before we agree to join them. In the elegant theoretical model Granovetter proposed, riots were started by people with a threshold of zero—instigators willing to throw a rock through a window at the slightest provocation. Then comes the person who will throw a rock if someone else goes first. He has a threshold of one. Next in is the person with the threshold of two. His qualms are overcome when he sees the instigator and the instigator’s accomplice. Next to him is someone with a threshold of three, who would never break windows and loot stores unless there were three people right in front of him who were already doing that—and so on up to the hundredth person, a righteous upstanding citizen who nonetheless could set his beliefs aside and grab a camera from the broken window of the electronics store if everyone around him was grabbing cameras from the electronics store."

The media's endless coverage of every mass murder is driving copycats...and no one is doing anything about it.  It's not that journalists individually know this and are OK with it - they're trapped in a system designed to drive clicks and views, and endless coverage of mass murder is a profitable way to do that.  There's a better summary of this situation here.

We now have a stack of voices naming this out loud: the Washington Post, the Federalist, criminologists, the Ethical Journalism Network, and others.

---

So, what to do?  There are many great, thoughtful proposals out there - here's one from the Columbia Journalism Review.  The gist is that we can still responsibly cover mass murder - driving awareness, resources, policy change, prevention, and the free flow of information in our democracy - while limiting the media contagion.  We can do this by not printing the person's name, picture, or manifestos/ravings/messages, and by not comparing kill counts.  Phrases like "deadliest shooting spree" or "gunman" create a morbid romanticism, even a gamification in a dark mind.

Another proposal is to call on the media to de-monetize coverage of mass murders.  Selling ads by spreading media contagion is a bit like selling soup prepared by Typhoid Mary.

We need a website written by respected authorities in journalism laying out these proposals.  We need politicians to use their voices to raise the issue, and we need grassroots boycotts of advertisers who buy ads on outlets that refuse to report responsibly.

Our journalists generally feel their work is a vocation, not just a job.  They're proud of the role they play in the nation's well-being and advancement, and I'm sure it's horrifying for a person to realize they're part of this morbid feedback loop - more murders, more coverage, more murders.  Just conjecture here, but perhaps part of the reason journalists are so ardent in their support of gun control as a solution to mass murder is that they're aware of their role, and are looking for a scapegoat to restore their self-image as "the good guy."




Monday, May 7, 2018

Poverty and Geography in Minneapolis


It's an open secret that concentrated poverty is at record levels and getting worse.  This has been occurring in tandem (it's a feedback loop) with a new structural unemployment that has stayed at a bleak 40-year high since 2012.

The short story is that even as our economy has improved and Americans in general have gotten wealthier, the bottom 20% or so have been left behind.  You can see that from 2000 to 2018, 5% of workers dropped off the face of the Earth. This is awful. 

Concentrated poverty is a big contributor to this - clustering poor people together means, as Ed Sheeran's song says, "the worst things in life come free to us."  Poor communities have higher crime, substance addiction, worse public services, less social capital, less opportunity, worse education, basically a basket of awful variables that form a Feedback Loop of Awful (let's call it FLA).  

This blog post is about geographic isolation - one of the variables in the FLA.  Of course, access to the rest of the city is valuable, so the cheapest housing is the least accessible.  I've now lived in the poorest, most violent, and highest minority part of Minneapolis for a year, and a few things have become empirically obvious to me. 

Take a look at this map.  To the bottom left you have the richest suburbs with the corporate jobs.  To the bottom right you have the airport.  To the right of the S in Minneapolis you have the U of MN, and to the left of the M in Minneapolis you have "the hood," North Minneapolis. 

Here's a closer look at the city and North:

Above the words "Near North," and to the west of 94,  is where the hood begins.  We affectionately refer to it as the "North."  It takes up the entire area to the northwest.  A few local knowledge things to note:
  • To the east of the river there are tons of resources, amenities, culture.  The North is separated by both a 10-lane highway and the Mississippi River. 
  • There is a train that goes from downtown to the airport.  It never reaches North.
  • 94, between the words "North Loop" and the junction with 35W, is forever deadlocked.  This short stretch of highway adds 15 minutes to your trip, every time.  This means any trip from the North to anywhere south or east is at least a half hour - cutting North off from the south and east of the state.  This is not true of land east of the river, where 35W runs north and south smoothly. 
  • There is a stretch of no-man's land between 94 and the Mississippi that is inhospitable.  It's industrial and ugly.  It's basically a DMZ to separate the rich and poor.
  • The closest part of North is eminently unwalkable from downtown.  First you have to cross an intimidating, rusty concrete bridge (take a look below...yikes) across 10 lanes of I-94, and then walk 7 blocks of nasty, noisy, windswept industrial buildings before you reach downtown.  Again, a DMZ to separate the rich and poor.
What all of this adds up to is isolation - concentration of poverty.  Some suggestions:
  1. Beautify the overpass bridge and the trip to downtown.  This would be cheap and easy: protect pedestrians from the wind and noise of the overpass, repair the sidewalks, plant trees, set up lighting, and incent those who own the industrial buildings to slap on a new coat of paint every once in a while.
  2. Finish the train track across the river, into North.
  3. Incent walkable business and retail in the no-man's land. 
  4. Improve and expand local streets with a north/south traverse in mind. 
  5. Figure out some way to improve the deadlock on 94!



Sunday, February 18, 2018

R Syntax Explained

Aggregate: This is used to apply a function (like mean) across a data set that is subset according to your needs.  For example, if you have a table of car sales details and prices (called mydata) and you want to know average sale prices, you can't just average the entire price column: you need the average price for each type of car.  Let's say your table has these columns: make, model, and price.

Aggregate takes a few inputs.  The first item is the dataset you care about: in this case, the table mydata, but specifically the price column.  Second item is a list of what subsets you'd like to create.  For example, we want to subset every row that matches "Ford" and "F150" and average their price.  So our second item is what categories we want to break the data out into: in this case, we want to see every unique combination of make and model.  The last item is the function we want to apply to the subset: average, median, etc.

result <- aggregate="" by="list(mydata$MAKE," mean="" mydata="" p="">
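
Since this blog also covers pandas elsewhere, here's a rough Python equivalent for comparison (the table contents are invented for illustration):

```python
import pandas as pd

# Hypothetical car-sales table matching the example above.
mydata = pd.DataFrame({
    'MAKE':  ['Ford', 'Ford', 'Toyota'],
    'MODEL': ['F150', 'F150', 'Camry'],
    'PRICE': [30000, 34000, 25000],
})

# Average PRICE for every unique MAKE/MODEL combination,
# analogous to R's aggregate(..., by=list(...), FUN=mean).
result = mydata.groupby(['MAKE', 'MODEL'], as_index=False)['PRICE'].mean()
print(result)
```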

Filter and Select:  One of my favorite combinations. 
Filter takes two inputs: your dataset, and how you'd like to subset it.  So first input is our table mydata, easy enough.  Second input is a test: we give it the column Make, and test the values to see if they equal (==) Ford.  If the row's Make column contains Ford, filter will keep that row.  Otherwise, it's tossed. 

Select is then given the result from filter.  Filter has snagged every Ford row from our original table (mydata) and kept every column.  In other words, mydata started with columns make, model, and price, and filter's output still has all those columns. 

Select takes two inputs: one is your complete data set, the other is the column(s) you want to keep.  In this case, we want to keep the price column.  So this command eliminates every row that isn't a Ford sale and gives you a 1-column table of the prices of those Fords. 

In other words, the command below answers the question "give me just the prices from every Ford sale in the table."

select(filter(mydata, Make=='Ford'), Price)
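
A rough pandas equivalent of this filter-then-select, for comparison (the table here is invented for illustration):

```python
import pandas as pd

# Hypothetical table matching the example above.
mydata = pd.DataFrame({
    'Make':  ['Ford', 'Toyota', 'Ford'],
    'Model': ['F150', 'Camry', 'Fiesta'],
    'Price': [30000, 25000, 18000],
})

# Keep only Ford rows (the filter step), then keep only the
# Price column (the select step).
ford_prices = mydata[mydata['Make'] == 'Ford'][['Price']]
print(ford_prices)
```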


Native dataframe manipulation: Sometimes you don't need commands like filter or aggregate to get the subset of data you want.  Let's say you have a 1-column table (call it car_returns) of the prices of all the cars that were brought back by customers and refunded.  How would you identify the make and model of the cars that were returned, just from the price?  So let's say the question is "give me all the rows (including make, model, and price) from my original table (mydata) that match these prices."

In general, you can subset a dataframe with a [] after the name: mydata[].  Inside the bracket, we'll need to pass two pieces of information: first, what column in mydata will correspond to the values in car_returns?  Obviously, price.  Now, we aren't comparing mydata's price column to a single price: we need to compare it to all prices that are in the car_returns table.  So we will use %in% to say "we want all the rows from mydata where Price equals one of the values from car_returns." 

mydata[mydata$PRICE %in% car_returns, ]

The other thing you'll notice is the comma after car_returns.  Inside the brackets, everything before the comma picks rows (here, the rows whose Price matches a returned price), and everything after the comma picks columns.  Leaving the spot after the comma blank keeps every column.  If we only wanted certain columns, we would name them after the comma.  For example, if we just wanted the Make of the car, we could do this:

mydata[mydata$PRICE %in% car_returns, "MAKE"]
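
In pandas, the same %in%-style subsetting is done with .isin() (example data invented for illustration):

```python
import pandas as pd

mydata = pd.DataFrame({
    'MAKE':  ['Ford', 'Toyota', 'Honda'],
    'MODEL': ['F150', 'Camry', 'Civic'],
    'PRICE': [30000, 25000, 22000],
})
car_returns = [25000, 22000]   # prices of the returned cars

# All columns for the matching rows, like R's bracket subset
# with nothing after the comma.
returned = mydata[mydata['PRICE'].isin(car_returns)]

# Just the MAKE column for those same rows.
returned_makes = mydata.loc[mydata['PRICE'].isin(car_returns), 'MAKE']
```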

Tuesday, January 23, 2018

Dahua IP Cam Setup

  1. Power: this thing doesn't come with a power source.  Seriously
    1. PoE or 12V input.  Do not do both!
    2. If your camera is outside, do PoE.  Look for something like this to boost your power: Single Gigabit Port PoE+ Injector – 30W – 802.3at https://www.amazon.com/gp/product/B00B4H00EO/ref=oh_aui_detailpage_o00_s00?ie=UTF8&psc=1 
    3. If you're going to do 12V
      1. BV-Tech DC12V 1A UL-Listed Switching Power Supply Adapter for CCTV - 5 Pack - Black
  2. Connecting to your camera
    1. Connect the camera via ethernet to your router/switch
    2. IE works OK but has issues.  Use Chrome; it will prompt you to download and use a specific app.
    3. There appears to be no default IP - it's DHCP.  So log into your router and see what IPs are connected to it.  Mine always landed at 192.168.1.14 or .15.
    4. Open a browser to http://
    5. Default password is admin/admin

  1. Basic Setup
    1. Change the password:
      1. System | Account | click the pencil under "modify" | check "modify password"
    2. Connect to wifi:
      1. Network | wifi | check "enable" | double click the correct network and put in password
    3. Set the wifi IP address:
      1. Network | TCP/IP | At "Ethernet Card" click the dropdown and select "Wireless" | click the "Static" button | enter an IP address that you would like this camera to live on permanently | Click Save
      2. Set a new name for your device - something like "garage" or "front door" | click Save
      3. open the browser to your new IP address
      4. Unplug the ethernet cable
    4. Upgrade to latest code (very important if you don't want to get hacked)
      1. Find your device's latest firmware at https://dahuawiki.com/Firmware_Search_Tool/IP_Camera
      2. Unzip the package
      3. My firmware file was called DH_IPC-ACK-Themis_EngSpnFrn_N_V2.400.0000.15.R.20170804.bin
      4. System | Upgrade | Browse | Select the firmware file
      5. Click upgrade
    5. Set System Time
      1. System | General | Date&Time | Set your GMT time | Save
      2. Set DST here if you want.
  2. Advanced Setup
    1. Enable HTTPS (this is really important, it encrypts your connection to your camera)
      1. Network | Create | fill in all the boxes | change duration to 5000 days | click save
      2. Click install | Click download | Click save
      3. Check the box for "enable HTTPS" | Click save
    2. I set frame rate from 30 to 10 and enabled smart codec. https://www.dahuawiki.com/News/H.264_Plus_vs._H.264

Monday, December 18, 2017

Bandcamp, Music and Android

Here's a quick walkthrough on how to download music from bandcamp.com onto your Android phone.  First, you'll need an unzip app - Easy Unrar is free.  Find it in the app store (Google Play) and download it.

I'll throw red dots on these screenshots to make it easy for you to follow along.

Next, get your download code and click on the link - for example, https://hopehymns.bandcamp.com/yum.  Input your code and click next.

Now click "here's how"

Click the second "Here's how", then select your file type (I used MP3 220, but I'm not a music expert so YMMV).   Then click download.

Great!  Your music is downloaded.  Now we need to unzip it.  Open Easy Unrar and click the "Download" folder.

If you've had your phone awhile, it might be tough to wade through all the downloads and find your album.  So click on the sort button on the upper right.

Choose sort by file size, large to small.

Since your album is probably more than 100MB, it should be near the top.  Here you can see Hope Hymns, the album I want.  Check the box and click Extract.


Check this box and click extract as well.

OK - good news and bad news.  Bad news is you're not done; good news is you're almost done.  Now we need Google Play Music to rescan and discover your unzipped album files.  Open up the Settings app and click on Applications.

 Now find and click Google Play Music

Almost done!  Click Storage

!Important!  DO NOT click "Clear Data."  DO click "Clear Cache"

Now reboot your phone, open up Google Play Music, and you're done!  Enjoy.

Thursday, September 21, 2017

Pandas

This is just a running list of useful things about Pandas as I learn.




If you have data coming in from a .csv, use this: df=pd.read_csv('file.csv')

If your dataframe has strings and you want them to be numbers, use this: df.column = pd.to_numeric(df.column, errors='coerce')

If you have date/time in linux epoch format, you can convert using this: df['date'] = pd.to_datetime(df['date'],unit='s')

If you want to index on select columns: df.iloc[:, :2]  (the older df.ix is deprecated; iloc is the positional replacement)

df.describe() gives you count, mean, std, min, max, and percentiles

If you want to group on one column and average another: df.groupby('column')['other'].mean()

If you want the rows where a column equals some value: df[df['column'] == value]


Grab specific columns by name: df1 = df[['a','b']]

data.iloc[:, 0:2] # first two columns of data frame with all rows

this will square each cell, skipping the first column.  df.iloc[:,1:7]=df.iloc[:,1:7].apply(numpy.square, axis=0)
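
To tie several of these snippets together, here's a tiny end-to-end run (the CSV contents and column names are made up, and io.StringIO stands in for a real file on disk):

```python
import io
import numpy
import pandas as pd

# A tiny CSV standing in for 'file.csv' in the snippets above.
csv_text = "date,price,qty\n1505952000,10,2\n1506038400,bad,3\n"
df = pd.read_csv(io.StringIO(csv_text))

# Coerce strings to numbers; unparseable values become NaN.
df.price = pd.to_numeric(df.price, errors='coerce')

# Convert linux epoch seconds to datetimes.
df['date'] = pd.to_datetime(df['date'], unit='s')

# First two columns of the frame, all rows.
first_two = df.iloc[:, 0:2]

# Square each cell, skipping the first (date) column.
df.iloc[:, 1:3] = df.iloc[:, 1:3].apply(numpy.square, axis=0)
print(df)
```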

great link on managing jupyter: https://www.datacamp.com/community/tutorials/tutorial-jupyter-notebook

Great guide to pandas: http://www.lining0806.com:1234/pandas/Pandas%20DataFrame%20Notes.pdf

https://www.shanelynn.ie/select-pandas-dataframe-rows-and-columns-using-iloc-loc-and-ix/
https://www.youtube.com/watch?v=POe1cufDWFs

Thursday, March 30, 2017

Rancher Setup Basics

A client asked recently how SolidFire can integrate with Rancher.  I had a few RHEL servers available, so I'm going to set up Rancher on RHEL.  Here are the first steps:

Install a supported version of Docker (align compatibility for Docker, K8s, and Rancher): curl https://releases.rancher.com/install-docker/1.12.sh | sh

sudo service docker start

sudo docker run -d --restart=unless-stopped -p 8080:8080 rancher/server

Alright, let's pause here.  What did we just do?  First, we installed Docker.  Docker is the software that enables you to easily download, create, run, and manage containers.  Next, we made sure the Docker service was running.  Last, we downloaded and ran a container with the Rancher server software.  At this point, you should be able to reach Rancher's GUI at your server's IP on port 8080.

So let's get K8s and Trident up!  First we need a place to deploy K8s.  Click Infrastructure | Hosts.

Click add host.


And then save

Then enter the IP address of the server that will function as a host for containers.  Follow the instructions to copy-paste the command into a console on your new host server.


Done!



https://docs.rancher.com/rancher/v1.5/en/installing-rancher/installing-server/#single-container

Why Rancher: http://rancher.com/beyond-kubernetes/