Over the last few months we held two great events, and you can now watch both of them online.
The first event was held on January 21st, where we discussed whether Barcelona can become a European hub for Advanced Analytics and Big Data. As speakers we had Josep Maria Martorell (Associate Director at the Barcelona Supercomputing Center) and Òscar Sala (mVentures Director at Mobile World Capital Barcelona). You can watch it at the link below.
The second event was held on February 14th (Valentine's Day!), where we reviewed the role Analytics can play in Sports (are we close to a Moneyball world?). As speakers we had Sergi Oliva (Senior Director, Analytics & Strategy at the Philadelphia 76ers) and Javier Fernandez (Head of Sports Analytics at FC Barcelona). You can watch it at the link below.
We are pleased to announce our next event, “Sports Analytics”, on February 14th at 19h at the Movistar Centre. Doors will open at 18:45. You can register here.
In this session we will focus on the role Analytics can play in Sports. Are we close to a Moneyball world? As speakers we will have Sergi Oliva (Senior Director, Analytics & Strategy at the Philadelphia 76ers) and Javier Fernandez (Head of Sports Analytics at FC Barcelona).
This event would not be possible without the collaboration of the Movistar Centre.
In recent months, Ethics has emerged as an extremely sensitive topic for the Data and Analytics community. Most likely, one of the main drivers of this wave of concern was the Facebook scandal: Mark Zuckerberg (founder and CEO of Facebook) had to testify before the US Congress about how his company handles its users’ data and how this could have influenced the results of recent elections in several countries. But Facebook is not the only company whose practices are under scrutiny. Many questions have also been raised about how much personal data Google collects and how it is being used: according to Guillaume Chaslot (an ex-Google engineer), the YouTube algorithm “does not appear to be optimising for what is truthful, or balanced, or healthy for democracy”.
In other words, we are talking not only about privacy but also about how data could even threaten our political system. As Cathy O’Neil writes in her must-read book Weapons of Math Destruction, “the math-powered applications powering the data economy were based on choices made by fallible human beings. Some of these choices were no doubt made with the best intentions. Nevertheless, many of the models encoded human prejudice, misunderstanding and bias into the software systems that increasingly managed our lives. Like gods, these mathematical models are opaque (…) Their verdicts, even wrong or harmful, were beyond dispute or appeal. And they tended to punish the poor and the oppressed in our society, while making the rich richer”.
As Data-Driven professionals we cannot ignore this inconvenient truth and must address it. This is one of the reasons why we at BcnAnalytics organised a session to discuss Data & Ethics. As speakers we had Carlos Castillo (Distinguished Research Professor at Universitat Pompeu Fabra) and Gemma Galdon (Founder at Eticas Research & Consulting and Researcher at Universitat de Barcelona).
Carlos focused his talk on algorithmic discrimination. He first reviewed the concept of discrimination from a philosophical perspective and then explained the concept of group discrimination, which means “disadvantageous treatment to an individual because he or she belongs to a specific socially salient group”. According to Carlos, a further step is statistical discrimination, which can be observed “when group discrimination happens because of some statistical belief, which means that someone has certain data, has looked at this data and based on statistics extracted from this data has decided to treat someone worse than another person”. After reviewing these concepts, Carlos raised the key issue: machine learning algorithms can discriminate.
Why is that? Machine learning systems take data and extract statistical beliefs from it, and they are therefore able to discriminate against some individuals, regardless of intent or animosity. The key aspect is the consequence of the algorithm: treating a person worse because he or she belongs to a group. Carlos emphasized that to avoid this discrimination, models need to optimize not only for accuracy but also look at “the risk of two different populations of not getting the same outcome”. Carlos also highlighted how important it is that systems be transparent: “if you get a negative outcome, you have to have a way to challenge this decision in a way that is effective… If I am denied a loan or parole, I need to have a way of effectively challenging the decision to say the system was wrong in my case”.
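The group-fairness check Carlos describes can be sketched in a few lines. The sketch below is illustrative, not something presented at the event: it compares the rate at which two populations receive a positive outcome (e.g. a loan approval) rather than looking only at accuracy. The function names, the toy data, and the 80% threshold are our own assumptions for the example.

```python
def positive_rate(outcomes):
    """Fraction of individuals who received the positive outcome (1 = approved, 0 = denied)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact(protected_group, reference_group):
    """Ratio of positive-outcome rates between two groups (1.0 means parity)."""
    return positive_rate(protected_group) / positive_rate(reference_group)

# Toy data: hypothetical loan decisions for two populations
group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # 6/8 approved -> 75%
group_b = [1, 0, 0, 1, 0, 0, 0, 1]  # 3/8 approved -> 37.5%

ratio = disparate_impact(group_b, group_a)
print(round(ratio, 2))  # 0.5 -> below the commonly used "80% rule" threshold,
                        # flagging a potential discrimination risk to investigate
```

Even a model with high overall accuracy could produce a ratio like this, which is exactly why Carlos argues accuracy alone is not enough.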
Gemma started her talk by quoting “The Fall of Public Man” by Richard Sennett: “In a city full of sensors and cameras and surveillance everywhere, where would Romeo and Juliet fall in love?”. From Gemma’s perspective, technology is changing our lives and we really need to ask ourselves: Why are we investing in technology? What kind of societies are these technologies creating or promoting? Are we building the cities that we want to build? Do we want to live in a world where everything is remembered? Do we want to live in a world where we can never forget? As she mentioned: “for the first time in history, forgetting is more expensive than remembering. Everything we do is recorded by a camera or a sensor”. Gemma then reviewed real cases of unexpected outcomes of certain technologies, for instance, smart borders based on biometrics. They were not part of the legislative debate because they were seen “as technical amendments”, but biometrics have now become our IDs, and certain individuals self-mutilate when they want to hide their identities. In other words, their bodies became their enemies.
Gemma asked herself: “How can we hide behind a technical amendment? And what about false positives? There is no redress mechanism”. According to her, the most burning issue is that we, as a society, did not think technology could fail. But it fails. And this triggers the key issue: the way we do technology is very irresponsible, and no one is facing the consequences of their actions, the consequences of their false positives… which might affect human rights. Gemma ended her speech highlighting the fact that we need to start thinking about how technology is impacting our civilization: “we have the responsibility to decide how we build a social-technical infrastructure that is responsible and desirable for our generation and the next generations”.
The future is in need of smart engineers who will design and build devices, robots and software that will improve our lives. Computational thinking, which includes problem decomposition, data representation, abstraction and algorithmic skills, will be a fundamental skill set for these engineers, and, in fact, not only for them but for any person whose job involves problem solving.
Despite the importance of computational thinking, this skill set is still not widespread in the education of today’s children and youth. In this session, different speakers will talk to us about how they are empowering the kids of today to become the engineers of the future.
We are very happy to announce that Linda Liukas from HelloRuby will be the keynote speaker of this exciting Schibsted evening. Linda is a Finnish developer and educator who has co-founded Rails Girls and who has written Hello Ruby: Adventures in Coding, a book to teach kids how to program in a fun way.
Accompanying Linda we will also have two educational organisations, CodeLearn and Girls in Lab, who will talk about how to bring computer programming and technology to kids.
Hi, my name is Enrico. I was an intern at BcnAnalytics, and at the beginning of 2015 I started working with Aleix Ruiz de Villa and Josep Marc Mingot on an idea intended to help mostly small businesses promote their products and services online.