For a society that can't get enough of technology, it's interesting that the current spotlight on AI has sent some people spiraling. Nazlıcan Yöney, a marketing veteran (she has held posts at Google and Microsoft) and the founder and creative director of House of Sól jewelry, breaks down the concepts that make AI so powerful – and explains why the creative industry should embrace it.
Professor John McCarthy
by Nazlıcan Yöney
Dear Reader,
Are you one of those people who have been flooding Instagram with their AI-generated portraits these past few months, or are you one of those who watch them with judgy eyes and quietly ask themselves, "For God's sake, what are you doing?" before unfollowing?
Frankly, I watch with great excitement – almost as if I were holding a bag of popcorn in my hand – as artificial intelligence technologies transform every life and every industry, the creative one included, at such a rapid and disruptive pace.
That's why I can't wait until the end of the article to share my two cents as an industry veteran: it's such a relief that artificial intelligence has finally come into the picture and the fake artists who painted shoddy pictures in Times Square have been left without a job. We have all been exposed to enough mediocre art to last a lifetime. But does this mean that artists will starve in the near future? Of course not. Are photographers going hungry now that everyone on the street carries a 48-megapixel camera in their phone? Nope. In this brave new world, access to technology has become far more democratized; the gifted have a more accessible path to grow their talent, and mediocre photographers finally have to withdraw from the market. You know what they say: survival of the fittest.
What's the secret behind AI chatbots with a higher IQ than some of your colleagues, or photo-realistic images generated in seconds? Why is the technology gaining popularity now, after decades of development, and where will this 'AIssance' lead us? Let's dive into the origin story to get a better understanding.
As a matter of fact, artificial intelligence has been in our lives for much longer than we realize. It could even be older than most of you, my dear readers, since it dates back to the 1950s! For the past 70 years, thousands of scientists from all over the world have been working collectively to pin down its definition, its limits and its areas of use. Just like human history and knowledge itself, artificial intelligence has progressed cumulatively, each generation gathering and building on the work of the last; developing and training it in such a collective way has brought us, knot by knot, to where we are today. But how did this vision start, and what was its purpose? I want to ask some very fundamental, "AI for Dummies"-level questions, starting with the connection between its name and its focus: Artificial Intelligence.
Doesn't it sound like a bit of an oxymoron when you first hear it? Professor John McCarthy, widely regarded as one of the founding fathers of the field, coined the term and had a vision for AI that was both practical and ambitious. At the Dartmouth Conference in 1956, he unveiled that revolutionary vision to the world and made history, establishing the moment as the birthplace of 'artificial intelligence' as a concept – and, with it, of its corresponding field of study.
His goal was to create machines that could replicate human intelligence and problem-solving, making it possible for computers to perform tasks that would otherwise require human cognition – in short, machines that mimic the human brain. But it was not a one-way street, for sure. He genuinely believed that by creating machines that could think and learn, researchers would not only improve technology but also gain a deeper understanding of human intelligence and cognition. The aim was to describe every aspect of learning, or any other feature of intelligence, so precisely that a machine could be made to simulate it. How fascinating is that?
To that end, the professor and his team invited computer scientists, mathematicians, cognitive psychologists and engineers from all around the world – anyone interested in the idea of creating machines that could think and learn like humans – to this historic conference. Early research in the field focused on developing "thinking machines" that could perform tasks normally requiring human intelligence, such as understanding natural language, recognizing objects and patterns, and making decisions based on them. To achieve this, the machine was taught to "think" using well-known methods such as rule-based systems and decision trees – explicit instructions, much like the rules we lay down when raising our children.
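If you'd like to picture what "rule-based" means in practice, here is a minimal sketch in Python – a toy of my own, not a reconstruction of any actual early program – where every decision is an explicit if/then rule a human has typed in by hand:

```python
# A purely illustrative sketch (my own invention, not an actual early AI
# program): in a rule-based system, the "intelligence" is a set of explicit
# if/then rules that a human has written by hand.

def classify_animal(features: dict) -> str:
    """Guess 'cat' or 'dog' from hand-picked features using fixed rules."""
    if features.get("barks"):
        return "dog"
    if features.get("meows"):
        return "cat"
    if features.get("weight_kg", 0) > 15:
        return "dog"  # crude heuristic: heavier pets are more often dogs
    return "cat"      # default guess when no rule fires

print(classify_animal({"barks": True, "weight_kg": 30}))  # -> dog
print(classify_animal({"meows": True, "weight_kg": 4}))   # -> cat
```

The weakness is obvious: someone has to anticipate and hand-write a rule for every situation the machine might face. That limitation is exactly what the next idea addresses.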
Over time, a different approach took center stage: machine learning. Machine learning algorithms use statistical techniques to identify patterns in data, enabling them to make predictions or decisions without being explicitly programmed for a specific task. While this may sound frightening, it's actually quite exciting! With its incredible potential for innovation and automation, machine learning is truly an awe-inspiring discovery. Instead of giving the program explicit instructions, a large dataset is fed into its algorithm, which allows the machine to "learn" how to perform tasks by recognizing patterns and relationships in the data. In this way, machines gain more autonomy and need less human intervention to complete their tasks.
Does that sound too abstract? Let's use a simple example. Suppose you give your algorithm images of cats and dogs to teach it the distinction between the two animals. Then you show it a new photo that wasn't part of its training dataset, and mere seconds later the program correctly identifies the animal in the picture as a dog. How? Well, it has just learned!
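For the curious, here is a toy sketch of that cat-versus-dog idea in Python using the scikit-learn library. The features and numbers are invented for illustration – real systems learn from the raw pixels of millions of photos – but the principle is the same:

```python
# A toy, purely illustrative sketch of the cat-vs-dog example. The two
# features (weight and an invented ear "floppiness" score) stand in for real
# image pixels; the point is only that the pattern is learned from labeled
# examples rather than hand-coded as rules.

from sklearn.tree import DecisionTreeClassifier

# Training data: [weight in kg, ear floppiness from 0.0 to 1.0]
X_train = [
    [30.0, 0.9],  # dog
    [25.0, 0.8],  # dog
    [8.0, 0.7],   # dog (a small breed)
    [4.0, 0.1],   # cat
    [5.5, 0.2],   # cat
    [3.5, 0.0],   # cat
]
y_train = ["dog", "dog", "dog", "cat", "cat", "cat"]

# "Training": the algorithm searches the data for patterns that separate
# the two labels; nobody writes the rules by hand.
model = DecisionTreeClassifier().fit(X_train, y_train)

# A brand-new animal the model has never seen before.
print(model.predict([[27.0, 0.85]]))  # -> ['dog']  (it has just learned!)
```

Notice that nobody wrote a rule saying "heavy animals with floppy ears are dogs"; the model inferred that pattern from the labeled examples on its own.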
Now, let's simplify these concepts and bring them together. AI can be thought of as a toolbox filled with different tools, and one of them is machine learning. But don't get me wrong – it is not a minor one. This component plays an integral role in how we use AI today. Machine learning paved the way for fields such as computer vision and natural language processing, and it is why we are having this conversation today. Even compiling this article was a breeze thanks to the help of artificial intelligence. Could AI supplant my work as an author in the future? Absolutely not! Instead, technology should be viewed as an opportunity we can grasp and use to produce better work in far fewer hours.
Every transformation in our lives can bring either a crisis or an opportunity; it all depends on how well prepared we are and the outlook with which we face it. If you're daunted by the rapid pace of technology, let's take a step back and reassess. Let's think outside the box and push ourselves to explore new possibilities. Hardly two decades ago, the smartphones we read this article on didn't even exist. Fast-forward to today, and there are millions of people who owe their livelihoods solely to the technology these devices offer. The secret is to keep your head up and never stop reading, researching, experimenting, and learning new areas and tools. As Oscar Wilde said, "We are all in the gutter, but some of us are looking at the stars."