
Your algorithmic future: weapons of maths creation and destruction

Science fiction writer William Gibson said, “The future is already here, it’s just not evenly distributed.” When you look around you can see the truth of that statement. Most of the technologies that will influence us over the next few decades already exist. In many ways it feels like we’re living in parts of that future. We can 3D print replacement jaws for people, and 3D printing was invented over 30 years ago. In NDRC, where I work, we have companies working on embedded sensors for post-operative bleed detection, and on helping kids with focus and ADHD problems through neuro-feedback game play. [1] In many ways technology is enriching our lives. In reality the title of this piece is less ‘Our Algorithmic Future’ than ‘Our Algorithmic Present’.

As a technophile, I find that very exciting. I have a deep and abiding love of science and the wonderful possibilities of technology. I grew up reading Isaac Asimov (his science and his fiction), Arthur C. Clarke and Carl Sagan, and watching Star Trek, Tomorrow’s World and other optimistic visions of technology and the future.

At the same time there is a darker side to technology. Paul Ehrlich said, “To err is human, to really foul things up requires a computer.” It’s not hard to find examples. California released 450 high-risk, violent prisoners on an unsuspecting public in 2011 due to a mistake in its computer programming. We-Connect, an app-based vibrator, captures the date and time of each use and the selected vibration settings, and transmits that data, along with the user’s personal email address, to its servers in Canada, “unbeknownst to its customers”, a number of whom are now suing the company.[2]

And darkest of all is the case of the firing of elementary school teacher Sarah Wysocki by Washington DC Public Schools. The school system used “VAR”, a value-added statistical tool, to measure a teacher’s direct contribution to students’ test results. Despite being highly regarded in classroom observations, the low score from the algorithm led to her being fired. There was no recourse or appeal, and no way to really understand the workings of VAR, as they are copyrighted and cannot be viewed.[3]

Computer Says No

There is this abstract notion of what the computer said or what the data tells us. Much as the complex gibberish that underlay the risk models of economists and financial services companies in the run-up to the crash wasn’t questioned (because maths), the issue here isn’t the algorithms so much as people and their magical thinking.

I came across this quote from IPPN Director Sean Cottrell, from his address to 1,000 primary school principals at Citywest Hotel in 2011.[4] He commented:

‘Every calf, cow and bull in the State is registered by the Department of Agriculture & Food in the interests of food traceability. Why isn’t the same tracking technology in place to capture the health, education and care needs of every child?’

Well intentioned as it might be, this shows a poor understanding of cows, a worse understanding of technology and a dreadful misunderstanding of children and their needs. I find this thinking deeply disturbing, and profoundly creepy, so I decided to unpack it a little.

This is how we track cows (image: a tagged cow), and this is how we start that process by tagging calves (image: a tagged calf). And I wondered: is this how he’d like to track children? (H/T to @Rowan_Manahan for that last image.)

Then I realised that we are already tracking children. Only it’s not the primary principals’ network that’s doing it; it is private companies doing the tracking and tagging. It is Google and Facebook and Snapchat, with some interesting results and some profound ethical questions. We now know that Instagram photos can reveal predictive markers of depression, and that Facebook can influence mood and people’s purchasing habits.[5]

Our algorithmic present is composed of both data and algorithms. We have had exponential growth in processing capability over the last number of years, which has enabled some really amazing developments in technology. Neural networks first emerged in the 1950s, dimmed in the late 1960s, re-emerged in the 1980s and have taken off like wildfire in the last few years. The neural network explosion is down to the power, cheapness and availability of GPUs, together with improvements in the algorithms themselves. And neural networks are really, really good at some kinds of pattern analysis. We are getting to a point where they are helping radiologists spot overlooked small breast cancers. [6]

There is also a very big problem with algorithms: the problem of the black box. The proprietary nature of many algorithms and data sets means that only certain people can look at them. Worse, we are building systems in a way where we don’t necessarily understand their internal workings and rules very well at all.
A black box looks like this: in many systems we see some of the input and the output, but most of what happens in between is not only hidden, it is not understood. In a classic machine learning model we feed in data and apply certain initial algorithms, then we use the result for prediction or classification. But we need to be careful of the consequences. As Cathy O’Neil cleverly put it, Donald Trump is an object lesson in bad machine learning: iterate on how the crowd reacts to what he says and over-optimise for that output, the classic problem of a machine learning model trained on a bad data set. We need to think about what the systems we’re building are optimising for. [7]
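
As a minimal sketch of that classic loop (the library choice and the toy data here are my own assumptions, not any specific system discussed above):

```python
# The classic supervised-learning loop: feed in data, apply an algorithm,
# then use the result for prediction or classification. The model optimises
# exactly the signal it is given; if the labels encode a biased signal
# (say, "did the crowd react well?"), it faithfully reproduces that bias.
from sklearn.linear_model import LogisticRegression

# Toy data: rows are input features, labels are the outcome we optimise for.
X_train = [[0.1, 0.9], [0.8, 0.2], [0.9, 0.1], [0.2, 0.7]]
y_train = [1, 0, 0, 1]  # 1 = "crowd reacted well" is the only signal fed in

model = LogisticRegression()
model.fit(X_train, y_train)          # step 1: feed in data, apply the algorithm
print(model.predict([[0.5, 0.5]]))   # step 2: use it for prediction/classification
# Nothing in this loop ever asks whether y_train was a good thing to optimise for.
```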

George Box said that “All models are wrong but some are useful.” Korzybski put it more simply: “The map is not the territory.” It’s important to remember that an algorithm is a model, and much as the human mind creates fallible, biased models, we can also construct fallible computer models. Cathy O’Neil put it bluntly: “A model is no more than a formal opinion embedded in code.” The challenge is that the models are more often than not created by young white males from an upper middle class or upper class background. It is not that human brains are perfect model makers, but we have spent a long time building social processes to cope with these biases. The scientific method itself is one of the most powerful tools we’ve invented to overcome them.
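
To make “a formal opinion embedded in code” concrete, here is a hypothetical scoring model of my own invention; every number in it is an opinion, not a fact:

```python
# A hypothetical teacher-evaluation score. The weights are not facts of
# nature; they are choices someone made about what matters and how much.
def teacher_score(test_gain: float, observation: float, attendance: float) -> float:
    # Why 0.8 for test gains? Because the model's author decided so.
    return 0.8 * test_gain + 0.1 * observation + 0.1 * attendance

# A teacher highly regarded in classroom observations can still "fail"
# if the embedded opinion weights test results above everything else:
print(teacher_score(test_gain=-0.5, observation=0.9, attendance=0.95))  # about -0.215
```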

As we unleash these models on education (Sarah Wysocki), policing (pre-crime in Chicago), health and hiring, we need to be aware of the challenges they pose. As Suman Deb Roy has pointed out:

Algorithmic systems are not a settled science, and fitting it blindly to human bias can leave inequality unchallenged and unexposed.  Machines cannot avoid using data.  But we cannot allow them to discriminate against consumers and citizens. We have to find a path where software biases and unfair impact is comprehended not just in hindsight. This is a new kind of bug. And this time, punting it as ‘an undocumented feature’ could ruin everything. [8]

Bernard Marr illustrates this with an example:

Hiring algorithms. More and more companies are turning to computerized learning systems to filter and hire job applicants, especially for lower wage, service sector jobs. These algorithms may be putting jobs out of reach for some applicants, even though they are qualified and want to work. For example, some of these algorithms have found that, statistically, people with shorter commutes are more likely to stay in a job longer, so the application asks, “How long is your commute?” Applicants who have longer commutes, less reliable transportation (using public transportation instead of their own car, for example) or who haven’t been at their address for very long will be scored lower for the job. Statistically, these considerations may all be accurate, but are they fair? [9]
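
Here is a hedged sketch of the kind of screening rule Marr describes; the field names and thresholds below are invented for illustration, not taken from any vendor’s system:

```python
# Each rule is statistically defensible on its own terms, yet each also
# proxies for income, housing stability and neighbourhood. The fairness
# question never appears anywhere in the code.
def screen_applicant(commute_minutes: int, years_at_address: float, owns_car: bool) -> float:
    score = 100.0
    if commute_minutes > 45:   # correlated with shorter job tenure
        score -= 30
    if years_at_address < 1:   # correlated with "flight risk"
        score -= 20
    if not owns_car:           # "less reliable transportation"
        score -= 15
    return score

# A qualified applicant who relies on public transport and recently moved:
print(screen_applicant(commute_minutes=60, years_at_address=0.5, owns_car=False))  # 35.0
```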

There is an old saying in tech: “GIGO: Garbage In, Garbage Out”. The risk now is that this will become BIBO: “Bias In, Bias Out”.

As we gather vast amounts of data the potential for problems increases. There can be unusual downstream consequences, and also opportunities to create perverse incentives. We are embedding sensors in cars, and looking at the idea that safer drivers will be given better rates. The challenge is that personalised insurance breaks the concept of shared risk pools, and can drive dysfunctional behaviour. Goodhart said, “When a measure becomes a target, it ceases to be a good measure.” We had a significant recent Irish example with crime statistics, where the CSO pointed out problems with both the under-recording of crime by police and the downgrading of a number of reported crimes. [10]

At one level I see our future as a choice between Iron Man (technology to augment) and Iron Maiden (technology controlled by a few that inflicts damage on the many). Technology to augment or to constrict. Technological changes that threaten the self also offer ways to strengthen the self, if used wisely and well.

It is clear that technology does not self-police. We could cut off the use of phones in cars using technology – so a phone can’t be used while driving – but the companies that could do so currently choose not to.

In Europe we have our own bill of rights, a charter of fundamental rights enshrined in the Lisbon Treaty, which guarantees that “Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.” This right has been used to challenge the export of data from the EU to the US in the Schrems decision of the European Court of Justice. [11]

My belief is that we need to extend these rights in the algorithmic era. We need to create a “Charter of Algorithmic Rights” for our algorithmic age. Not a Magna Carta, which really just enabled the lords against the king without much for the peasants. We need algorithmic rights of the people, by the people and for the people.

Simply put, we need airbags for the algorithmic age. For decades cars have been safer for men than for women, because the standard crash test dummy is built to a male size standard, biasing the development of safety towards the average male. As I said, technology is not self-policing. [12]

We are going to have to create better tools. We need to be able to detect and correct bias, and to audit for fairness rather than simply optimising for efficiency. Otherwise we are tying things together in unforeseeable ways that can have profound consequences at the individual and the societal level. Tools such as Value in Design and thought experiments help, but we need to go much further.
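
As one small example of the kind of audit tooling we need (the data and the parity check below are my own illustrative assumptions, not an established standard):

```python
# A toy demographic-parity audit: compare a system's positive-outcome rate
# across groups. A real audit would go much further, but even this simple
# check surfaces gaps that pure efficiency metrics never would.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, accepted) pairs."""
    totals, accepted = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        accepted[group] += int(ok)
    return {g: accepted[g] / totals[g] for g in totals}

decisions = [("A", True), ("A", True), ("A", False),
             ("B", False), ("B", False), ("B", True)]
print(selection_rates(decisions))  # about 0.67 for A vs 0.33 for B: a gap to flag and explain
```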

Kate Crawford, writing in Nature, says:

“A social-systems analysis could similarly ask whether and when people affected by AI systems get to ask questions about how such systems work. Financial advisers have been historically limited in the ways they can deploy machine learning because clients expect them to unpack and explain all decisions. Yet so far, individuals who are already subjected to determinations resulting from AI have no analogous power.” [13]

Augmentation

While this is necessary, I don’t believe it’s sufficient. We need a “Charter of Algorithmic Rights”. While looking to the opportunities technologies can afford, we need to recognise their biases and limitations. What appears to be augmentation may not really be the case. It may restrict and rule rather than enable.

We need to ensure that our tools are creative and reflect the diversity of human experience.

We are better off managing them than being managed by them in our algorithmic future.

Footnotes.

[1] The companies mentioned are Enterasense and Cortechs.

[2] “Computer errors allow violent California prisoners to be released unsupervised” can be found here, and the story on the app-based vibrator is here.

[3] One link to the Sarah Wysocki story is here; for more details read Cathy O’Neil’s excellent book “Weapons of Math Destruction” or take a look at Cathy’s blog.

[4] Original link was tweeted by Simon McGarr. The piece is here: http://www.ippn.ie/index.php/advocacy/press-releases/5000-easier-to-trace-cattle-than-children

[5] How an Algorithm Learned to Identify Depressed Individuals by Studying Their Instagram Photos: https://www.technologyreview.com/s/602208/how-an-algorithm-learned-to-identify-depressed-individuals-by-studying-their-instagram/ and https://arxiv.org/pdf/1608.03282.pdf; Everything we know about Facebook’s mood manipulation: http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

[6] Computer technology helps radiologists spot overlooked small breast cancers: http://www.cancernetwork.com/articles/computer-technology-helps-radiologists-spot-overlooked-small-breast-cancers. Neural nets may be so good because they map onto some fundamental principles of physics: http://arxiv.org/abs/1608.08225

[7] Trump as a bad machine learning algorithm: https://mathbabe.org/2016/08/11/donald-trump-is-like-a-biased-machine-learning-algorithm/

[8] Genesis of the Data-Driven Bug: https://www.eiuperspectives.economist.com/technology-innovation/genesis-data-driven-bug

[9] Bernard Marr, The 5 Scariest Ways Big Data is Used Today: http://data-informed.com/the-5-scariest-ways-big-data-is-used-today/

[10] What is the new Central Statistics Office report on Garda data and why does it matter?
http://www.irishtimes.com/news/crime-and-law/q-a-crime-rates-and-the-underreporting-of-offences-1.2268154
and CSO (2016) http://www.cso.ie/en/media/csoie/releasespublications/documents/crimejustice/2016/reviewofcrime.pdf

[11] DRI welcomes landmark data privacy judgement: https://www.digitalrights.ie/dri-welcomes-landmark-data-privacy-judgement/ and Schrems v. Data Protection Commissioner: https://epic.org/privacy/intl/schrems/

[12] Why Carmakers Always Insisted on Male Crash-Test Dummies
https://www.bloomberg.com/view/articles/2012-08-22/why-carmakers-always-insisted-on-male-crash-test-dummies

[13] There is a blind spot in AI research, Kate Crawford & Ryan Calo:
http://www.nature.com/news/there-is-a-blind-spot-in-ai-research-1.20805


Sometimes more than 140 characters are needed

Sometimes you can say a lot in 140 characters. Sometimes you can’t. There were two tweets on Friday night, close to each other, tied to the attacks in Paris.

I tweeted

And my friend and former Storyful colleague Paul tweeted

The contrast caught some people by surprise, so I thought I’d expand on my thoughts. Sometimes more than 140 characters are needed. I love Twitter. It is possibly my favourite piece of technology, and I say that as someone who has lived, eaten and breathed technology for more than 20 years. It can be a glorious human sensor network reflecting the pulse of the planet. A learning library, a pub conversation, a place of humour or enlightenment. And a wonderful feeding point for information junkies.

I was having a quick flick through my timeline on Friday night when I saw news of what was happening in Paris starting to emerge. And then I tweeted that thought. And then I turned Twitter off.

I turned it off because I have no one close in Paris. There was no information I needed, nothing I could do to help anyone in Paris and nothing I could add to the story. There is a small irony here, as the first defining moment for me on Twitter was the Mumbai bombings in 2008. Up until then it was an interesting chat room that competed with Jaiku for attention. Mumbai underlined the power of Twitter for me. I find much of my news through Twitter, and a path into much emerging technology and many interesting conversations.

In 2010 I joined Mark Little in Storyful. I knew a lot about technology and very little about news. I worked with some of the smartest journalists on the planet (Mark Little, Gavin Sheridan, David Clinch, Markham Nolan, Malachy Browne, Áine Kerr and many others) in a company that helped define how journalism, technology and social media can and should interact.

I remember being in the Storyful offices the day of the Utøya killings, hearing the first report of a bomb, then of shootings. I watched the Arab Spring unfold and the Syrian war start. Watched shootings in the US, and occasional cat videos as well. There is a visceral, almost primal energy in a newsroom during a major breaking story. Journalists play an important role, underscored by Ruth McAvinia’s tweet

And yet I do not miss those moments, because of the human toll: the people who have died or been injured. I do not miss having to put in place psychological training for journalists, and processes and policies for staff who see something disturbing in a piece of video they are watching. And there is the noise. I do not miss the noise of agendas and ideologies attempting to twist every event to their own ends. Paul Bernal summed up many of my thoughts perfectly, writing:

The aftermath of the events in Paris has shown many of the worst things about the current media and social media. I’ve been watching, reading and following with a feeling, primarily, of sadness. What depresses me the most – and surprises me the least – is the way that the hideousness has been used to support pretty much every agenda.

and

All I can do is sigh. And feel more sadness. I see the points that everyone has. And yet all I feel is sadness. There isn’t an easy solution to any of this. There aren’t easy answers. There really aren’t, no matter how tempting some of the ideas might be. I wish there were.

We have a situation where even Donald Trump is misquoted (or quoted out of time). Most of this adds noise and not signal. We need more reflective thought along the lines of Zeynep Tufekci (read her timeline and more) and Kenan Malik.

What we need now is not noise and fear and knee-jerk actions or reactions.  What we need now is to trace the arc of these stories and to start to find deeper solutions to this problem. 


Digital Age Orientation Day

If you have two and a half minutes take a look at the video below on Bronze Age Orientation day. It’s short, it’s funny and it’s true.

It underlines that the concept of change is not unique to our digital age. No company ever likes to transform itself (if the caterpillar had a choice, would it become a butterfly?), but in most cases they have learned to accept the fact that they have to.

Companies have been dealing with mergers, acquisitions, buyouts, restructurings, de-localisations and re-localisations for a long time. If you’ve been through it you’ll know this sort of radical transformation is slow and painful: the all-or-nothing approach forcefully pushed by many consultants is not popular and falls flat a lot of the time. The Digital Age is here. It’s been coming for twenty years, and to succeed in this environment every business needs to be a digital business.

We’ll be running a day-long course at the end of the month on digital change and how to manage it in your organisation. Our approach is gradual, offers numerous intermediate steps, defines clear deliverables and measurements, and ties companies to their specific context. It’s Digital Transformation tempered by Change Management methods and experience.

You can find out more here.


Algorithms: prose written by people

As the ACLU highlights the problem of algorithms that discriminate, we need to remember that algorithms are only as good as the assumptions they’re based on.

In reality we’d be better off if we replaced the idea of algorithms as “clean, unbiased maths” with “prose instructions written by fallible people.”

But alongside the potential for bringing about social progress, the Internet also holds the possibility of contributing to unlawful discrimination. An example of this potential negative impact is a patent recently acquired by Facebook that could conceivably permit loan servicers to gain access to the credit ratings of a loan applicant’s social network and then use that information to determine whether the applicant qualifies for a loan. The patent combines the possibility of serious invasions of privacy with the realistic prospect of illegal lending discrimination.
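
Here is a sketch of the mechanism the patent could conceivably enable; the function names, scores and cutoff are invented for illustration:

```python
# Judge a loan applicant by the average credit rating of their social
# network: two otherwise identical applicants get different answers purely
# because of who they know. This is redlining by friend graph.
def network_average(scores):
    """Average credit score of an applicant's connections."""
    return sum(scores) / len(scores)

def loan_decision(friend_scores, cutoff=600):
    return network_average(friend_scores) >= cutoff

print(loan_decision([720, 680, 700]))  # True  (well-off social circle)
print(loan_decision([560, 540, 580]))  # False (poorer social circle)
```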

More here.  The Guardian also had a piece on some of these problems two years ago.

Image of Justice via Wikimedia


Technology and Change in under three minutes

I use this video regularly to highlight some key points on technology and change.

It has the virtues of being very true and very funny.


Snails, Systems and Slack

Paul Quigley, CEO of NewsWhip, wrote a lovely blogpost about the great snail derby of 1988. Faced with the problem of getting the snails to race in the same direction, an innovative 6yo came up with snail trails.

 Snail trails. Snail trails are not a product you can buy. Snail trails are a streak of water, placed in front of a snail using one’s fingertip.

You see, snails prefer pushing themselves over wet surfaces than dry surfaces. My girlfriend observed that a simple streak of wetness leading directly from the snail’s current position to the finish line kept them on the straight and narrow, so to speak. Snail trails saved the day, and the snail derby of 1988 was a roaring success.

Paul goes on to describe how you create snail trails for customer acquisition for SaaS businesses.

There is a broader lesson for businesses in the use of technology. Deming said that “A bad system will beat a good person every time.” A snail trail is a better system. It’s a very clever use of lightweight technology to reduce friction in a process. Good systems do that. They reduce organisational friction. They drive better organisational conversations. And the value of reducing friction in processes and conversations is very, very large.

One company that is building organisational snail trails is Slack. I’m a very big fan of Slack. The value of what it’s doing is rumoured to be up to $2 billion, double what it was worth 12 months ago. That’s part of the value that Slack is capturing, and it reflects a small portion of the value it is creating by building better snail trails.

/Dermot

Our featured image is Snail Trails from Luís Estrela on Flickr


A Short Video on what happens when the data is all connected.

The ACLU produced a video a number of years ago about ordering pizza in the future. Warning or prediction? Sometimes it’s hard to tell the difference. About the only part that doesn’t make sense is the voice explaining what all the problems are. Algorithms don’t explain.


How do you categorise your children?

The Primary Online Database is one of the worst ideas I’ve come across. Dave Molloy wrote a good piece about how easily the system could be abused.

I want to pick at one of the little threads: the ethnic or cultural background categorisation.

It’s useful when looking at a system to take apart the assumptions underlying it. There is a much longer piece to be written about categorisation; we’ll take one element here. This is the list of drop-down choices for one of the pieces of information in the Primary Online Database.

Ethnic or cultural background (drop-down list)

White Irish
Irish Traveller
Roma
Any other White Background
Black African
Any other Black Background
Chinese
Any other Asian background
Other (inc. mixed background)
No consent

The comment I used on Twitter when I first saw this list: “The word you’d use to describe the list of ethnic/cultural choices in the Dept of Education planned Primary Schools Database is WRONG.”

Categories are artificial ways of slicing up the world.  Dave Snowden wrote an interesting post on Categories recently. In it he quoted a passage from Aldous Huxley’s ‘Brave New World’ which is worth quoting again here.

Alpha children wear grey. They work much harder than we do, because they’re so frightfully clever. I’m awfully glad I’m a Beta, because I don’t work so hard. And then we are much better than the Gammas and Deltas. Gammas are stupid. They all wear green, and Delta children wear khaki. Oh no, I don’t want to play with Delta children. And Epsilons are still worse. They’re too stupid to be able to read or write. Besides they wear black, which is such a beastly color. I’m so glad I’m a Beta.

I was going to describe the categories above as “not even wrong”; a better description is probably “wronger than wrong.” They are wrong in that they are a very poor and very distorting classification. As Dave said in his piece:

The problem with categories is that things are made to fit within the boundaries
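
A tiny sketch of the mechanical consequence (the function is hypothetical, though the category strings come from the Department’s own list):

```python
# A forced-choice drop-down throws away exactly the information that
# doesn't fit the designer's boxes: everything unanticipated collapses
# into "Other".
CATEGORIES = {
    "White Irish", "Irish Traveller", "Roma", "Any other White Background",
    "Black African", "Any other Black Background", "Chinese",
    "Any other Asian background", "Other (inc. mixed background)",
}

def categorise(self_description: str) -> str:
    if self_description in CATEGORIES:
        return self_description
    return "Other (inc. mixed background)"  # the boundary does the distorting

print(categorise("Brazilian and Japanese"))  # -> "Other (inc. mixed background)"
```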

What I wonder about is the mindset of someone who comes up with these particular categories.


Our dopamine driven present

Hans de Zwart has written a piece on how the Chinese artist Ai Weiwei is living in all our futures. The whole thing is well worth a read; there is a lot to unpack in it. The paragraph that points to the new intermediaries (Google, Facebook, Netflix) that sit between us and everything else is worth a book on its own. As is Disney normalising surveillance and quantified-self technology through MagicBands.

I was once asked to help someone start a business that put trackers on kids. I found it deeply creepy and still do, though as a parent of kids, a little piece at the back of your mind is going, “Well, it’d be nice to be sure.” Which is why these things will probably sell despite being creepy. The end of the piece points out another problem with constant monitoring:

We need failure to be able to learn, we need inefficiency to be able to recover from mistakes, we have to take risks to make progress and so it is imperative to find a way to celebrate imperfection.

The bit that really resonated with me was on casinos, Natasha Dow Schüll and her book ‘Addiction by Design’.

In it, she clearly shows how the slot machine industry has designed the complete process (the casinos, the machines themselves, the odds, etc.) to get people as quickly as possible into ‘the zone’. The player is seen as an ‘asset’ for which the ‘time on device’ has to be as long as possible, so that the ‘player productivity’ is as high as possible.

The comments on the use of defibrillators in casinos are especially disturbing. The logical jump made is that

Facebook is very much like a virtual casino abusing the same cognitive weaknesses as the real casinos.

And as Hugh MacLeod pointed out

Of course the pointless babble and the social grooming may be the very point of social media (the clue is in the name).

Featured image by Steffen Banhardt on Flickr, licensed via Creative Commons.


Packaging up your children’s data (no date on a sale)

The Department of Education is requesting that data on all primary school children be captured in a Primary Online Database. The list of data they are asking for is comprehensive. It includes PPS numbers.

This is what we know

A new database of primary school students will gather personal information, including PPS numbers, information on ethnic and cultural background and religion. Some details will be kept for up to 30 years (Irish Times)

The list of data includes:

• First and second names

• PPS number

• Mother’s maiden name

• Date of Birth and gender

• Full address

• Mother tongue

• Ethnicity

• Religion

• Irish language exemptions

• Enrolment date, teacher / class details

• Previous school / pre-primary education

• Learning support details

There is also a free text box for “Notes about a pupil”. The Department’s reference manual on the database states:

Notes about a pupil may be entered into the ‘Notes’ tab. At present, notes entered here can be seen by Department of Education staff but this is to be changed so that only the school user may see the notes.

The Department is also claiming that

it is compulsory for parents to register their children. In the event a PPS number is not available for a student, the Department will use the mother’s maiden name to look up Department of Social Protection records.
The Department also reports that only information on ethnic and religious background requires the consent of a parent or guardian.

The data to be captured on the system may be entered on esinet.ie, in an Excel spreadsheet, or on a school’s own system.

It is not clear that the data is safe or secure, or how it is being protected by the Department of Education or the schools.

Simon McGarr has pointed out “The Department of Education data protection notice explicitly refers to kids’ PPS Numbers as non sensitive data.”

The Department of Education’s PPSN use case statement hasn’t been notified to or approved by the Department of Welfare (via Simon McGarr).

The Department of Education claims the support of the INTO and the National Parents Council. Speaking to one school principal, it appears to be very controversial, not least because of the huge administrative workload it puts on schools, including being asked to retain the data for 30 years.

One member of the board of the National Parents Council (Primary) wasn’t even aware of the existence of the Primary Online Database.

I’m going to copy Simon McGarr’s tweets on this:

A single database of everyone under 30, including religion and ethnicity, is a major change in the relationship between state and citizen. This isn’t the census, where strong legislative protections are given to the privacy of individuals. This will be a honeypot for datamining. The data is being used for a purpose other than that for which it was obtained. Storing all details of a primary school pupil until they’re 30 is excessive data retention.

The Department of Education is acting ultra vires (beyond their powers) in demanding the information.

As a parent you can write to the school and the Board of Management and inform them that

The Department is acting beyond its power. The Department of Welfare hasn’t been informed of, or consented to, the use of PPSNs. They are breaching data protection legislation by using data for a purpose other than that for which it was obtained, and by planning to retain that information for an excessive period of time.

Schools and boards should be aware that they are personally responsible as data controllers to ensure that the data they hold is not used for a purpose other than that for which it was obtained, and that they are personally at risk if the Department has acted ultra vires in demanding this data.

You can tell them explicitly that you don’t consent to giving them this data or to them transferring the data outside the school.

As Gavin Sheridan pointed out 

the only other database I can think of that is similar is PRIS, for the prison system

Credit for most of the points here goes to Simon McGarr. Image credit: ‘Human Barcode’ by Søren Mørk Petersen on Flickr, shared under Creative Commons.