
Community Broadcaster: Using Big Data for Community Radio



Written by: Ernesto Aguilar. October 14, 2016


Amid 170,000-plus attendees and 2,700 sessions, it is hard not to be awed by the sheer spectacle that was Dreamforce, the annual conference hosted by customer relationship management tools developer and service provider Salesforce.com. But beyond rumors that it will buy Twitter, why should Salesforce.com and CRM matter to small and local community radio, and what can radio learn?

Whether it is Tony Robbins or U2, Dreamforce attracts big names and marquee corporations. For good reason — Salesforce.com has made a tremendous name for itself across many industries, from finance to retail to every sector of technology imaginable. Some of the world’s biggest nonprofits use it to manage donor relations.

“All good,” you say, “but who cares?”

Hear me out. The noncommercial media space, including community radio and public media, has much to learn from successful nonprofits using data and technology to grow. The analytics revolution that Salesforce.com and competitors have ushered into modern life is also a chance for community radio and public media to assess what is most important. It matters because contributors have new expectations. It also matters because technology can help stations focus less on paperwork and more on the relationships with their supporters.

Three key things at Dreamforce struck me.

Community radio can use technology to grow what people expect of it. At Dreamforce there were many instances of nonprofits using data, mobile and service to engage supporters in ways that press community radio to consider how it can inspire members and underwriters, and expand its own service. One UK nonprofit brings public concern for the homeless to smartphones by letting users geolocate people in need so service providers can reach them. Black Girls Code and Code2040 leaders shared stories about how they made alliances with businesses work best for their constituencies. Discussions like these are incredibly instructive for community radio, which often fancies itself a voice for localism and subcommunities. Technology offers a chance to realize these ideals in a new, dynamic and creative way.

Community radio needs to embrace the new normal of data. Community radio collects all manner of information — recordings, volunteer information, etc. — but is missing a golden opportunity to do what it does better. More and more nonprofits are seeing how important it is to use data to show donors they care. Others still struggle. On the corporate side, Apple can tell you what a customer prefers and what they buy. Similarly, more and more nonprofits can track which programs a donor supports most, their average gift and when they’re most inclined to give. This level of tracking is eschewed in some circles as invasive. However, the reality is that the people who give to charity are exactly the people organizations need to value more. In my public media work, I’ve talked to many members who feel that an organization not knowing their giving habits equates to not caring about them personally. The world today has conditioned most people to expect connectedness as never before: they give out an email address and assume an organization has their billing information and giving history on file. Yet a 2014 study indicates that catering to these new customer expectations is among the lowest priorities. Community radio would benefit by switching it up.
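To make that concrete, here is a minimal sketch in Python of the kind of summary a station could pull from its own records, assuming nothing more than a flat list of gifts. The record layout, donor names and field names are hypothetical; a real membership database or CRM would hold far richer data.

```python
# Minimal sketch: summarizing donor history from a flat list of gifts.
# The record layout (donor, date, amount, program) is hypothetical.
from collections import defaultdict
from datetime import datetime
from statistics import mean

gifts = [
    # (donor, date, amount, program supported); all values invented
    ("a.rivera", "2015-03-02", 60.0, "news"),
    ("a.rivera", "2015-12-15", 120.0, "news"),
    ("a.rivera", "2016-03-05", 75.0, "music"),
    ("b.chen", "2016-09-20", 25.0, "music"),
]

by_donor = defaultdict(list)
for donor, date, amount, program in gifts:
    by_donor[donor].append((datetime.strptime(date, "%Y-%m-%d"), amount, program))

for donor, history in by_donor.items():
    amounts = [amount for _, amount, _ in history]
    programs = [program for _, _, program in history]
    months = [given.month for given, _, _ in history]
    print(donor,
          "| average gift:", round(mean(amounts), 2),
          "| favorite program:", max(set(programs), key=programs.count),
          "| most common giving month:", max(set(months), key=months.count))
```

Even a summary this simple answers the questions members say they expect a station to know: what they support, how much they typically give and when they tend to give it.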

The touch always matters most. Among the tiny and massive nonprofits at Dreamforce, the objective of all of these cool gizmos was clear: to make each organization’s people more effective at what they do, and to give them the fullest possible information for quality contacts with donor-members. Staff change, addresses change, but all nonprofits know their communication needs to be consistent and smart. The longtime supporter should have assurances that even new people know their importance to the organization, their history and what matters to them. A new donor should have regular, but unobtrusive, contact and a smooth ride into an organization’s world. As community radio leaders are well aware, it is tough to raise money and to convert the casual observer into an active giver. Technology can only enhance that contact; it’s the moment itself that matters most.

Community and public radio, and really all nonprofits, share some common cause in how the average business relates to a consumer. Where a business is trying to sell you an aesthetic, such as trust, a community radio station wants you to give money out of a higher ideal: mission, culture or a contribution to the commons. Community radio outlets are special snowflakes all, but we share the same challenges. Dreamforce offers but one example of ways to tackle our biggest puzzles.


Ernesto Aguilar is membership program director of the National Federation of Community Broadcasters. NFCB commentaries are featured regularly at radioworld.com.


 


Article Disclaimer: This article was published by Radio World, retrieved on 10/15/2016 and posted at INDESEEM for information and educational purposes only. The views, ideas, materials and content of the article remain those of the author. Please cite the original source accordingly.


 

Using Big Data to Predict Terrorist Acts Amid Privacy Concerns



By | October 13, 2016


Before Ahmad Khan Rahami planted bombs in New York and New Jersey, he bought bomb-making materials on eBay, linked to jihad-related videos from his public social-media account and was looked into by law enforcement agents, according to the Federal Bureau of Investigation.

If only the authorities had connected the dots.

That challenge — mining billions of bits of information and crunching the data to find crucial clues — is behind a push by U.S. intelligence and law enforcement agencies to harness “big data” to predict crimes, terrorist acts and social upheaval before they happen. The market for such “predictive analytics” technology is estimated to reach $9.2 billion by 2020, up from $3 billion in 2015, according to research firm MarketsandMarkets.

It’s the stuff of a science-fiction movie like “Minority Report,” in which Tom Cruise played a Washington cop who used technology to arrest people before they carried out crimes. It’s also a red flag for privacy advocates already fighting U.S. spy programs exposed by Edward Snowden and the FBI’s demands that Apple Inc. help it hack into encrypted mobile phones.

The idea is to make sense of the vast and disparate streams of data from sources including social media, GPS devices, video feeds from street cameras and license-plate readers, travel and credit-card records and the news media, as well as government and proprietary systems.

‘Fundamental Fuel’

“Data is going to be the fundamental fuel for national security in this century,” William Roper, director of the Defense Department’s strategic capabilities office, said at a conference in Washington last month.

For the first time, the White House released a strategic plan on Wednesday to advance research and development of artificial intelligence technology, including to predict incidents that may be dangerous to public safety.

Weeks before Rahami allegedly carried out the attacks in September, he bought circuit boards, electric igniters and ball bearings — all of which are known bomb-making materials, according to charging documents from the FBI.

In previous years, he was flagged by U.S. Customs and Border Protection and the FBI after he made trips to Pakistan and after his father told police he was a terrorist, before recanting the remark.

Law enforcement agents could have been tipped off that Rahami was moving toward an attack had all of those data points been culled together in one place, said Mark Testoni, chief executive officer and president of SAP National Security Services Inc., a U.S.-based subsidiary of German software company SAP SE.
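As a toy illustration of what pulling such data points into one place could look like, the sketch below merges events from several hypothetical source feeds by subject and flags subjects who surface in multiple independent feeds. It is not a description of SAP’s platform or any agency’s system, just the simplest possible data-fusion exercise with invented records.

```python
# Toy data-fusion sketch: flag subjects that surface in several
# independent source feeds. Feeds, subjects and notes are hypothetical.
from collections import defaultdict

events = [
    # (source feed, subject id, note); all values invented
    ("retail_purchases", "subject-17", "bought igniters and ball bearings"),
    ("social_media", "subject-17", "linked to violent propaganda videos"),
    ("travel_records", "subject-17", "extended travel to a conflict region"),
    ("tip_line", "subject-17", "family member reported concerns"),
    ("retail_purchases", "subject-42", "bought ball bearings"),
]

# Group the independent feeds each subject appears in.
sources_by_subject = defaultdict(set)
for source, subject, _ in events:
    sources_by_subject[subject].add(source)

REVIEW_THRESHOLD = 3  # arbitrary cutoff chosen for this illustration
for subject, sources in sorted(sources_by_subject.items()):
    if len(sources) >= REVIEW_THRESHOLD:
        print(subject, "appears in", len(sources), "independent feeds:", sorted(sources))
```

The hard part in practice is not the counting but the legal and technical walls between those feeds, which Testoni describes next.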

“This is a big data world now,” said Testoni. He said his company has developed a computer platform for doing predictive analytics that is being used in a limited way by a Defense Department agency and by a national security agency. He declined to name the government customers or specify what they are doing.

The technology to predict events is only in its infancy, Testoni said. National security and law enforcement agencies also have different rules when it comes to obtaining and using data, meaning there are walls between what can be accessed and shared, he said. U.S. law enforcement agencies, for example, need a court warrant to access most data.

Big Brother

Privacy advocates express concern about the “Big Brother” implications of such massive data-gathering, calling for more information and public debate about how predictive technology will be used.

“There’s often very little transparency into what’s being brought into the systems or how it’s being crunched and used,” said Rachel Levinson-Waldman, senior counsel to the National Security Program at the Brennan Center for Justice at New York University School of Law. “That also makes it very hard to go back and challenge information that might be incorrect.”

Computer algorithms also fail to understand the context of data, such as whether someone commenting on social media is joking or serious, Levinson-Waldman said.

Testoni’s company and others such as Intel Corp. and PredPol Inc. are among a handful of firms pioneering the use of predictive analytics and artificial intelligence for clients from local police departments to U.S. national security agencies.

More than 60 local police departments in the U.S. have started making use of a service sold by PredPol, which calls itself “The Predictive Policing Company,” to forecast where crimes might occur based on past patterns, said co-founder Jeff Brantingham.

What, Where, When

Its system, developed in collaboration with the Los Angeles Police Department, uses only three types of data: what type of crime occurred, when and where, Brantingham said.

Then, a software algorithm generates the probability of crime occurring in different locations, presented as 500-foot-by-500-foot squares on a computer display or a printed map. With that insight, police departments then can make decisions about how best to apply their resources, such as sending cops to a high-risk area, or which security cameras to monitor, Brantingham said.
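The article describes that grid approach only in outline, so the following is a deliberately crude sketch of the idea: bin past incidents into 500-foot cells and treat each cell’s share of historical incidents as a stand-in for risk. It is not PredPol’s actual algorithm, and all coordinates and incidents are invented.

```python
# Crude illustration of grid-based forecasting: bin past incidents
# into 500-foot cells and rank cells by historical counts.
# This is not PredPol's model; the data below is made up.
from collections import Counter

CELL_FEET = 500  # side length of each grid square, per the article

incidents = [
    # (crime type, hour of day, x_feet, y_feet) in a made-up local grid
    ("burglary", 2, 120, 480), ("burglary", 3, 300, 450),
    ("burglary", 1, 400, 200), ("theft", 14, 2600, 900),
    ("theft", 15, 2550, 950), ("robbery", 22, 2700, 880),
]

# Count past incidents per 500-foot cell.
cell_counts = Counter((x // CELL_FEET, y // CELL_FEET) for _, _, x, y in incidents)
total = sum(cell_counts.values())

# A cell's share of past incidents stands in, very naively, for the
# probability of crime occurring there.
for cell, count in cell_counts.most_common(3):
    print("cell", cell, "share of past incidents:", round(count / total, 2))
```

A real deployment would weight recent incidents more heavily and model how one crime raises the short-term risk of another nearby, but the output is the same in spirit: a ranked map of squares for patrol decisions.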

PredPol’s system doesn’t make predictions about who will commit a crime, so it stops short of a system that might identify a terrorist in the making.

“Interdicting places is, by and large, an approach that is more in line with protecting civil liberties than interdicting people,” Brantingham said.

Even with such limits, privacy and civil liberties groups oppose the use of predicting policing technology as a threat to the Constitution’s promises of equal protection and due process.

‘Fortune-Teller Policing’

“This is fortune-teller policing that uses deeply flawed and biased data and relies on vendors that shroud their products in secrecy,” said Wade Henderson, president and chief executive officer of the Leadership Conference on Civil and Human Rights. “Instead of using predictive technology to correct dysfunctional law enforcement, departments are using these tools to supercharge discrimination and exacerbate the worst problems in our criminal justice system.”

eBay, Amazon

Vast databases that companies have created for online commerce and communications could help law enforcement and national security agencies build predictive systems if they are allowed to tap into them. Technology companies have terms of service that set out how much personal information can be kept and sold to outside companies such as advertisers, and most resist handing over such data to the government unless a court orders them to do so.

Predictive analytics are already being used by companies like eBay Inc., Amazon.com Inc., and Netflix Inc. to crunch their users’ Internet activity to forecast what they might be interested in. Companies like Facebook Inc. and Twitter Inc. have access to over a billion social-media accounts. The storehouse of data on Americans will only grow with digital feeds from Internet-connected appliances and wearable devices.

Social media, in particular, is a valuable tool in tracking potential terrorist attacks, said Eric Feinberg, founding member of the Global Intellectual Property Enforcement Center, which is a private company. His firm has patented technology that can scan for hashtags across different social media platforms and in different languages for communications that indicate terrorist planning.

“Our software is about pattern analysis,” Feinberg said. “We focus on the communications stream.”
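A bare-bones version of the cross-platform hashtag scan Feinberg describes might look like the sketch below. The watch list, posts and platform names are invented for illustration; the patented system itself is not public.

```python
# Bare-bones sketch of scanning posts for watch-listed hashtags
# across platforms and languages. All data here is invented.
import re
from collections import defaultdict

# Hypothetical multilingual watch list of hashtags (stored lowercase).
WATCHLIST = {"#examplethreatterm", "#beispielbegriff"}

posts = [
    {"platform": "twitter", "text": "routine update #weather"},
    {"platform": "facebook", "text": "watch this now #ExampleThreatTerm"},
    {"platform": "instagram", "text": "photo dump #beispielbegriff #travel"},
]

hashtag_pattern = re.compile(r"#\w+", re.UNICODE)

# Collect watch-listed hashtags per platform, case-insensitively.
hits = defaultdict(list)
for post in posts:
    for tag in hashtag_pattern.findall(post["text"]):
        if tag.lower() in WATCHLIST:
            hits[post["platform"]].append(tag.lower())

for platform, tags in hits.items():
    print(platform, "matched:", tags)
```

The value of a system like the one Feinberg describes lies less in the matching itself than in maintaining the watch list across languages and spotting patterns in who posts what, and when.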

‘Open Source Indicators’

The U.S. government is working on initial efforts to gain insight into global social and political trends.

A program under the intelligence community’s research arm called Mercury seeks to develop methods for continuous and automated analysis of intercepted electronic communications “in order to anticipate and/or detect political crises, disease outbreaks, terrorist activity and military actions,” said Charles Carithers, spokesman for the Intelligence Advanced Research Projects Activity.

The agency also previously funded the Open Source Indicators program, which “developed methods for continuous, automated analysis of publicly available data in order to anticipate and/or detect significant societal events,” such as mass violence and riots, mass migrations, disease outbreaks and economic instability, Carithers said.

CIA Forecasts

The CIA draws a distinction between using technology to anticipate events and using it to predict them. The agency is using sophisticated algorithms and advanced analytics, along with publicly available data, to forecast events. The initial coverage focuses on the Middle East and Latin America.

“We have, in some instances, been able to improve our forecast to the point of being able to anticipate the development of social unrest and societal instability to within three to five days out,” said Andrew Hallman, the agency’s deputy director for digital innovation.

In its annual report in June, the Defense Science Board said, “Imagine if national leaders had sufficient time to act in emerging regional hot spots to safeguard U.S. interests using interpretation of massive data including social media and rapidly generate strategic options.”

“Such a capability may soon be achievable,” the board said. “Massive data sets are increasingly abundant and could contain predictive clues — especially social media and open-source intelligence.”

Poindexter’s Legacy

If U.S. intelligence agencies develop an advanced system to predict terrorist acts they might call it “Total Information Awareness.” Except that name has already been used, with unhappy results.

Retired Admiral John Poindexter created the “Total Information Awareness” program for the Pentagon’s Defense Advanced Research Projects Agency in 2002 to find and monitor terrorists and other national security threats using data and technology.

The program became so controversial, especially over concerns that privacy rights would be violated, that Congress canceled funding for Poindexter’s office in 2003.

Having been there and done that, Poindexter now says predicting terrorism is possible but would require a lot of data, such as banking information, analysis of social media, travel records and classified material.

The system also has to include strong privacy protections that the public can review, said Poindexter, who said he was working on such a “privacy protection application” when his program was canceled.

“You have to develop public trust in the way this is going to work,” said Poindexter, who continued developing the technology after leaving government through Saffron Technology Inc., a cognitive computing company that Intel bought in 2015 for an undisclosed price. Intel declined to comment.

“The government’s priorities should be to solve the privacy issue and start ingesting massive amounts of data into memory bases,” Poindexter said. “You have to get the public on board with the idea that we can collect and search information on terrorist planning that doesn’t have an adverse impact on innocent people.”


Article Disclaimer: This article was published by Insurance Journal, retrieved on 10/15/2016 and posted here at INDESEEM for information and educational purposes only. The views, ideas, materials and content of the article remain those of the author. Please cite the original article accordingly.