# Artificial Intelligence as an assistant composer?



## Eric G (Jul 5, 2017)

Over the last 12 months, I have noticed several major technology developments in artificial intelligence aimed at the music industry, particularly at music composition.

Of course, computer-assisted composition has been around in various primitive forms for years (auto-accompaniment, melody and harmony generators), but in the last 12 months more and more focus has been on AI composing complete pieces.

And serious money is being thrown at the companies creating this technology. Two stand out, as they have released or are about to release early versions of their AI-driven products: AmperMusic and Hexachords.

In their marketing materials, neither has been foolish enough to claim they are replacing composers. Their products will be "assistant" composers, with parameters that can be controlled by the "composer". But their ambition is clear: disrupt the music industry by letting content creators create their own custom music.

AmperMusic
https://www.ampermusic.com/
https://soundcloud.com/ampermusic
$5M invested in this startup ($4M in the last 6 months).
Initially targeted at short videos, ads, etc., and available today as a beta. Not many options for the composer, but the sound is decent. At this point, not very impressive to a professional composer. But for a content creator... I am not so sure it isn't close to being good enough.

Hexachords
http://www.hexachords.com/
https://soundcloud.com/hexachords
Investment unknown, but it has been in development since 2012 with a small team. Private alpha/beta in fall 2017. Much more sophisticated: chord progressions, mood, MIDI input/output, DAW integration. Their approach is much more palatable to a professional composer, as it leverages your current investments in VIs, DAWs, etc.

My background is in the tech industry, where startups come and go, or get acquired by bigger, well-established companies that integrate them into their products.

So how long will it take for this type of technology to start showing up in our DAWs? What if Cubase's Chord Track evolves into more? What if traditional arps evolve? (Look at the response to Sonuscore's The Orchestra, with its rather basic arp implementation.)

For me, AI has a long way to go to replace composers, if it ever does. But there is no doubt that AI is coming, as friend or foe. You decide. Check them out and let me know what you guys think.


----------



## Polkasound (Jul 5, 2017)

It will affect the music industry in that it will allow less skilled composers to create acceptable music with the same lack of skills. It will create a surge of cheap composers, driving down the cost of music production for B movies, some commercials, and some video games.

But I don't think it will reach any further. Artificial intelligence is still artificial. As good as it may become, no artificial engine can remember back to what its first kiss felt like, write a song that captures that moment, and sell it. For that, we'll always have composers.


----------



## mikeh-375 (Jul 5, 2017)

Polkasound said:


> It will affect the music industry in that it will allow less skilled composers to create acceptable music with the same lack of skills. It will create a surge of cheap composers, driving down the cost of music production for B movies, some commercials, and some video games.


Polkasound, this has already happened.


----------



## ctsai89 (Jul 5, 2017)

I don't think there's any need to worry about AI replacing composers, because by the time AI could actually, truly write monumental symphonies or real art music (can it be called real art if it was done by an AI?), AI would already have replaced people in every other job and profession. And agreed with @Polkasound: artificial is still artificial. No matter how good a piece of music a machine could write, I wouldn't call that piece "art". Artworks are human creations.


----------



## Replicant (Jul 6, 2017)

Despite my love of Sci Fi, I'm of the controversial opinion that AI and automation are actually going to make the world suck and be miserably boring, cause varying degrees of civil and political unrest, and ostensibly ruin the arts beyond all recognition; it preys on people's laziness to bolster profits under the guise of "convenience." We can replace the McDonald's workers with kiosks and screens all we want, but at the end of the day, you're still standing in the same line up.

Anyway,

I hope that AI would affect sample libraries in such a way that I can feed it raw, unadulterated MIDI on a single track and it can arrange it for orchestra in the most human, idiomatic way possible and with the click of a button, give me a completely new, random orchestration — or, I can define certain parameters or strictly tell it what lines should be what instruments. That would be amazing.

Still though, I'm really not amped for AI unless we get some sexy Cylons outta the deal.


----------



## DSmolken (Jul 6, 2017)

I think a lot of people are hoping that when AI technology gets better and starts pushing humans out of more and more jobs, their job will be one of the last ones.

I do arrangement and production. I'd actually like to replace myself with software for doing that, so I could spend more time making virtual instruments and vocalists, which is more interesting and more important to me. I already replaced myself as a bassist a few years ago. But, to be honest, making virtual vocalists is a lot of repetitive drudgery that could be done much faster and better with some machine learning, too.


Eric G said:


> In their marketing materials both have not been foolish enough to state they are replacing composers.


Then they're not thinking big enough and fun enough. When accused of trying to replace women, a certain sex doll manufacturer responded that it's their goal to replace all humans.


----------



## HiEnergy (Jul 6, 2017)

I've been using a composing assistant (Cognitone Synfire) for quite a while. It will still take a long time until tools like that fully replace composers.


----------



## X-Bassist (Jul 6, 2017)

I think being unique and interesting will always be uniquely human. These demos back up this theory.


----------



## Replicant (Jul 6, 2017)

X-Bassist said:


> I think being unique and interesting will always be uniquely human. These demos back up this theory.



That will be replicated too.

There is already an AI that can paint pictures absolutely nailing the style of revered artists from the Renaissance.

One day, when the technology advances, AI will be able to reflect on all its experience and create something "unique", in the same way that all of your influences lead you to create something different.


----------



## Bohrium (Jul 6, 2017)

HiEnergy said:


> I've been using a composing assistant (Cognitone Synfire) for quite a while. It will still take a long time until tools like that fully replace composers.



I own Synfire, too, and I'd be interested in how YOU use it ... If you think it derails this thread, we can start a new one.


----------



## ChristianM (Jul 6, 2017)

Bohrium said:


> I own Synfire, too, and I'd be interested in how YOU use it ... If you think it derails this thread, we can start a new one.



I've also been using Synfire Pro for prototyping for several years.
It's wonderful too!


----------



## d.healey (Jul 6, 2017)

If it can't make coffee then it's no assistant to me!


----------



## HiEnergy (Jul 6, 2017)

Started a new thread on Synfire in the DAW subforum: https://vi-control.net/community/th...a-rapid-prototyping-software-for-music.63347/


----------



## elpedro (Jul 6, 2017)

A.I. will replace humanity eventually, not just in composing. No doubt in my mind that it will. A.I. is already manipulating humans; if you think it isn't... well, then it's doing a good job already. The genie is out of the bottle, and nobody will put it back in. The race to develop it is on; nobody will want to be left behind.


----------



## Saxer (Jul 6, 2017)

Artificial intelligence isn't even able to go into the kitchen and make me a cheese sandwich, no matter how expensive the hardware is. Maybe in a few years, but when I was a child I was convinced a robot would be cleaning the house by the year 2000.

I don't think artificial intelligence will ever work the way we do. Not that it would be impossible, but it's in no way efficient. I think truly intelligent, learning composing software will generate digital waveforms directly and shape them based on listeners' emotional feedback.


----------



## Alatar (Jul 6, 2017)

d.healey said:


> If it can't make coffee then it's no assistant to me!



Alexa, please make me coffee.


----------



## FriFlo (Jul 6, 2017)

Saxer said:


> Artificial intelligence isn't even able to go into the kitchen and make me a cheese sandwich, no matter how expensive the hardware is. Maybe in a few years, but when I was a child I was convinced a robot would be cleaning the house by the year 2000.
> 
> I don't think artificial intelligence will ever work the way we do. Not that it would be impossible, but it's in no way efficient. I think truly intelligent, learning composing software will generate digital waveforms directly and shape them based on listeners' emotional feedback.


I am not so sure about that "never" ... or why else would a brilliant thinker like Elon Musk be so afraid of AI? I don't think he's just watched too much Terminator!
On the music side, I am not afraid that the machine will surpass the human being. But I know the film business and its superficiality and focus on business. There, I am afraid we could get to a point where a human being is just a chooser and picker of what the machine offers. Listening to a lot of scores nowadays, the musical essence on offer makes that kind of process entirely possible. Look at how many things get done with libraries nowadays. That is not so far from AI offering some combinations, actually! I am pretty sure there will be programs to make that possible on a grand scale. And looking at how many jobs library music is killing, this could kill even more.


----------



## Fer (Jul 6, 2017)

I heard some demos and noticed the artificiality but not the intelligence. I don't believe in AI. My mobile phone does not have AI, but it has a lot of artificial stupidity... and in the end, I guess that's what Elon Musk and the rest are afraid of.


----------



## Eric G (Jul 6, 2017)

ChristianM said:


> I've also been using Synfire Pro for prototyping for several years.
> It's wonderful too!



I own Synfire Pro. I really love how it sparks my creativity.

However, Hexachords Orb Composer's stated functionality goes way beyond Synfire Pro. Take a look and let me know what you think.


----------



## Eric G (Jul 6, 2017)

DSmolken said:


> Then they're not thinking big enough and fun enough. When accused of trying to replace women, a certain sex doll manufacturer responded that it's their goal to replace all humans.



Trust me, if you read around their websites, the subtext is clear. They want to eventually replace composers.

They simply don't want to tick off composers right out of the gate. Someone has to be around to push the button.


----------



## Alex Fraser (Jul 6, 2017)

Thanks for this post. Definitely an interesting discussion.

Some random thoughts:

The music demos sound, to put it bluntly, like a composer who is just starting out. Which I guess is what the AI is anyway.

Slightly more perplexing for me: is it really worth all the investment and man-hours to tackle a market (royalty-free music) that is already saturated, where there is no shortage of composers and prices are already driven down? It seems like a solution to a problem that doesn't need solving.
I assume the attraction would be to "edit" and "direct" the AI composition. Maybe that's the key.

Strikes me that if this sort of AI takes off, then as composers we'll have to stay one step ahead and write better, more complex music.


----------



## Bohrium (Jul 6, 2017)

Eric G said:


> I own Synfire Pro. I really love how it sparks my creativity.
> 
> However, Hexachords Orb Composer's stated functionality goes way beyond Synfire Pro. Take a look and let me know what you think.



Well, talk is cheap ... I suppose.
I think we can't really compare until we actually have Orb Composer. Do you have a copy?
Synfire is here now, and a few years back it had a lot of features that were unusable, too ... and it cannot replace anything in the composer's workflow. It just makes it easier to get from A to B ... like changing a chord or the whole progression in certain places, etc. It's just a tool ... and there is very little AI there (even though there is some in the analysis of MIDI).


----------



## DSmolken (Jul 6, 2017)

Alex Fraser said:


> Is it really worth all the investment and man hours to tackle a market (royalty free music) which is already saturated, where there is no shortage of composers, and where the prices are already driven down? It kind of seems like a solution to a problem that doesn't need solving?


Yeah, but it's kinda fun and interesting (to some people), and if it improves to the point that it becomes commercially viable, you'll be years ahead of everybody else.

I mean, why did I make a female virtual death metal vocalist?


Eric G said:


> Trust me, if you read around their websites, the subtext is clear. They want to eventually replace composers.
> 
> They simply don't want to tick off composers right out of the gate. Someone has to be around to push the button.


Well, I suppose "we want to replace all humans, and we're starting with composers because they're the easiest" might rub some people the wrong way...


----------



## BabyGiraffe (Jul 6, 2017)

Eric G said:


> Trust me, if you read around their websites, the subtext is clear. They want to eventually replace composers.
> 
> They simply don't want to tick off composers right out of the gate. Someone has to be around to push the button.


Well, composers can also become "filmmakers" with all the AI gadgets. The visual technologies are way more developed (imo) and there is more interest in them. The mathematical side of music (at least 12-TET, but you can find interesting articles on non-Western music at JSTOR etc.) was explored long ago.
It's only a matter of time before DAWs start implementing auto-composing and harmonizing (probably based on styles), melody-writing (vectors?), and orchestration (IRCAM already has something in this vein) tools.


----------



## Shad0wLandsUK (Jul 6, 2017)

ctsai89 said:


> I don't think there's any need to worry about AI replacing composers, because by the time AI could actually, truly write monumental symphonies or real art music (can it be called real art if it was done by an AI?), AI would already have replaced people in every other job and profession. And agreed with @Polkasound: artificial is still artificial. No matter how good a piece of music a machine could write, I wouldn't call that piece "art". Artworks are human creations.


I would not be so confident there. AI is much farther ahead than people think, if you have been reading into it. These things are kept under wraps until they are ready to be unveiled... and they have been working on this stuff for, I would say, at least 10 years now. Why do you think there is suddenly a surge of robots and intelligence in the market?

All electronic devices are connected, so you are looking at AI already having machine-learned something like 4 billion people's likes, dislikes, preferences, facial expressions, locations, languages, etc. People forget data is mined by the second.


----------



## wilberton (Jul 6, 2017)

It's certainly an interesting idea. While I don't think we'll be seeing computers suddenly start writing symphonic masterpieces, I can definitely imagine a system where you say you need 30 seconds of music that evokes emotion X, with edit points at 15" and 21". It could then generate 5 pieces for you to choose from; you pick one and it generates 5 variations on that version, and so on. In fact, I could probably program that right now given a copy of Omnisphere and a scriptable DAW!
It's not going to put Hollywood film composers out of a job, but it might become a viable option for TV on a tight schedule, or lower-budget games.
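The generate/pick/vary loop described above really is trivially programmable. Here is a toy sketch in Python; the note format, function names, and the one-note-per-second simplification are all invented for illustration and don't correspond to any real product's API:

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major, as MIDI note numbers

def generate_cue(length_s, hit_points, seed):
    """Toy 'composer': one note per second; edit points get an accent."""
    rng = random.Random(seed)
    return [{"t": t, "pitch": rng.choice(SCALE), "accent": t in hit_points}
            for t in range(length_s)]

def candidates(length_s, hit_points, n=5):
    """Generate n cues for a human to choose from."""
    return [generate_cue(length_s, hit_points, seed=i) for i in range(n)]

def variations(cue, n=5, keep=0.8):
    """Re-roll roughly 20% of the pitches in a chosen cue; accents survive."""
    rng = random.Random(99)
    return [[note if rng.random() < keep
             else {**note, "pitch": rng.choice(SCALE)}
             for note in cue]
            for _ in range(n)]

# "30 seconds of music with edit points at 15" and 21"":
cands = candidates(30, hit_points={15, 21})
chosen = cands[2]              # the human picks one...
variants = variations(chosen)  # ...and asks for 5 variations of it
```

A real system would of course render these events through a sampler or DAW and use something smarter than uniform random pitches; the point is just that the "N candidates, pick one, N variations" workflow itself needs almost no machinery.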


----------



## dpasdernick (Jul 6, 2017)

I heard of a book written about the future of automation and its effect on societies. I can't remember the author or the title, but he basically said there will be a generation of "useless people" coming soon: people who would have had lower-skilled jobs that will be replaced by automation. Perhaps these people will be given a yearly wage by the government just to sustain themselves. I think this type of life would be maddening for most people. Enough money to "survive", but not enough money or incentive to "thrive". You don't even have to get out of bed to get down to the fast food restaurant to flip burgers. It's a frightening outlook.

As far as AI and composing go no robot is going to f*ck up an Am chord like I can so bring it on Terminator. I'm the king of "artificial intelligence"


----------



## SoNowWhat? (Jul 6, 2017)

This is a really interesting discussion.
There are many jobs already affected by mechanisation and as AI develops further it will affect more and more people. I'm just as interested in finding out what everyone will do when there are limited opportunities for employment available. That could be a great opportunity.

edit - You beat me to it @dpasdernick . This is precisely what I was getting at. How will we engage our minds in a future with limited employment? I'm sure I heard a radio interview with an author of a book just as you describe.

I'm looking for a copy of Warnings by Richard A Clarke & R.P. Eddy right now. I'm not sure if this sort of discussion would fit into that book but, it might. Especially if someone is looking at potential flow-on effects, and I'm sure someone is.


----------



## Quasar (Jul 6, 2017)

I don't think the AI tech is as significant as the cultural shift toward "music" as a commodity, its use in the service of selling other commodities, and the ever more sophisticated exploitation of scientific understanding of how tone and rhythm affects the brain so that the music becomes nothing more than a tool to be deployed as a means to achieve another end.

A car commercial track on TV sounds, well, like a car commercial, because the "artist" doesn't have anything to say other than "buy this car". Whether this is generated by a human being or AI is of no concern to me.

The reason those fast food burger joints remain popular despite being unhealthy is that we now have so much empirically derived evidence about the chemistry behind what people crave: Sugar, fat, salt & starch. Bundle these together in a competently balanced package and millions of people will have "Big Mac attacks" or whatever. It's the same with standardized pop rock I–vi–IV–V et al progressions that are fashioned formulaically for making a quick buck, exploiting the known and predictable reactions people tend to have when their ears are exposed to these patterns. No reason an algorithm can't do that as well or better than a human being.
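How mechanical such a formula is can be shown in a few lines. This is a minimal sketch (the helper names are hypothetical) that spells out a I–vi–IV–V progression as diatonic MIDI triads in any major key:

```python
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale

def triad(key_root, degree):
    """Diatonic triad on a 1-based scale degree (thirds stacked in the scale)."""
    scale = [key_root + s for s in MAJOR_STEPS]
    i = degree - 1
    return [scale[i % 7] + 12 * (i // 7),
            scale[(i + 2) % 7] + 12 * ((i + 2) // 7),
            scale[(i + 4) % 7] + 12 * ((i + 4) // 7)]

def progression(key_root, degrees=(1, 6, 4, 5)):
    """Default degrees spell the standardized pop I-vi-IV-V."""
    return [triad(key_root, d) for d in degrees]

# In C (MIDI 60) this yields C, Am, F, G as MIDI note numbers
print(progression(60))
```

An algorithm choosing among a handful of such degree sequences, tempos, and instrumentations is exactly the kind of formulaic product-music generator described above.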

Music as product can and likely will be supplanted by AI in the Information Age, just as factory workers were replaced by machines in the late Industrial Age.

But music as art? Never.


----------



## X-Bassist (Jul 6, 2017)

Replicant said:


> That will be replicated too.
> 
> There is already an AI that can paint pictures absolutely nailing the style of revered artists from the Renaissance.
> 
> One day, when the technology advances, AI will be able to reflect on all its experience and create something "unique", in the same way that all of your influences lead you to create something different.



Seems many people underestimate human creativity, ideas, and personal style. It is not simply the sum of experiences or a process that can be duplicated. Some days I will pick one horn for a solo, another day a woodwind, another day a synth. It's not the song or purely my past experiences that change my mind; it's how I'm feeling at that moment and how I want to express that feeling, which is affected by a thousand things, including the random dreams I had last night (which are themselves dependent on thousands of things). So people here are thinking there will be a program that will factor in all those things and then add a "style" that will work well? And one that other humans will find compelling? I think that is so difficult for a computer to do that it is far in the future at best. Yes, they can write music, even music that can be sold. But don't assume it's easy to make music compelling, music that touches us all on an emotional level. A great composer can do this; I have yet to hear a computer-generated piece that can. So are you looking to make background tracks that just make money? Or do you want to touch people emotionally? Because there are easier ways to make money, if that's all someone wants.

Many brilliant musical humans have studied great composers of the past and every score they've done, yet have not been able to duplicate those composers' success. So you're saying some computer programming expert with less musical knowledge is going to program a computer that will "break that code"? I'm suggesting there is no code. As with great stories, once you reduce music to a pure formula, it doesn't work anymore. Witness the long line of bad Hollywood films.


----------



## Eric G (Jul 6, 2017)

Bohrium said:


> Well, talk is cheap ... I suppose.
> I think we can't really compare until we actually have Orb Composer. Do you have a copy?
> Synfire is here now, and a few years back it had a lot of features that were unusable, too ... and it cannot replace anything in the composers workflow. It just makes it easier to get from A to B ... like changing a chord or the whole progression in certain places ... etc. It's just a tool ... and there is very little AI there. (even though there is some in the analysis of MIDI)



LOL. Agreed, and I don't have a copy yet. But I am signed up for the beta out this fall. I will let everyone know once I get my hands on it.

And I do love Synfire as a musical-idea development tool.


----------



## HiEnergy (Jul 6, 2017)

dpasdernick said:


> I heard of a book that was written about the future of automation and its effect on societies...


This is what comes to mind:


----------



## ghobii (Jul 6, 2017)

I just signed up for the Orb beta, and found it rather amusing I had to check a box stating "I am not a robot".


----------



## Johann F. (Jul 6, 2017)

I was actually surprised to hear Hexachords' latest demos at http://www.hexachords.com/listen/. If it's 100% AI, then it created some nice melodies here and there. AI music usually sounds like diarrhea to my ears.

I can see this appealing to a lot of cheap indie game developers. And by cheap I mean those 14-year-old buggers who watched a few YT tutorials on Unity and want to pay you in Pokémon cards.


----------



## AlexRuger (Jul 6, 2017)

X-Bassist said:


> Like with great stories, once you put music into purely a formula, it doesn't work anymore. Witness the long line of bad Hollywood films.



Witness the long line of bad Hollywood films that sell like hotcakes.

It's not a matter of making music that's good. It's a matter of making music that is good _enough_, which is what I'm afraid of.


----------



## Puzzlefactory (Jul 6, 2017)

I would be worried about AI, personally.

It may not replace all the big stuff like Hollywood feature films or GoT-style TV shows, but all the little things like TV, adverts, trailers, corporate videos, etc. Library music stuff, essentially.

That could all be replaced, IMO, with a sophisticated enough algorithm.

After all, computers follow rules and music follows rules.


----------



## Eric G (Jul 6, 2017)

Johann F. said:


> I was actually surprised to hear Hexachords' latest demos at http://www.hexachords.com/listen/. If it's 100% AI, then it created some nice melodies here and there. AI music usually sounds like diarrhea to my ears.
> 
> I can see this appealing to a lot of cheap indie game developers. And by cheap I mean those 14-year-old buggers who watched a few YT tutorials on Unity and want to pay you in Pokémon cards.



What's really interesting is that they let the "composer" import MIDI, modify the structure of the song, etc. as a starting point, and it integrates with DAWs, and hence VSTs, allowing modern composers to use the dozens of sample libraries they own. I know I don't want to throw out my investments.


----------



## Eric G (Jul 6, 2017)

HiEnergy said:


> This is what comes to mind:




LOL. Wow. I guess I am a wannabe "Creative Snowflake" according to the video. Good post. Thx


----------



## AlexanderSchiborr (Jul 6, 2017)

But a computer isn't able to understand an "emotional reaction" like humans do, imo. Sure, it can follow the rules humans programmed it with and react somehow, but that doesn't really mean it understands the drama and feels it; its algorithm, programmed by humans, tells it what to do.

But here comes the problem: while an algorithm can maybe cover several situations, just change one variable and the computer ends up in an error. It is simply impossible to inject this into it, because enjoyment, fear, love, etc. are irrational, and that is exactly foreign to the systematic nature of a computer. And yes, I did listen to some music generated by AI. And while some of the examples are not bad at all, to put it mildly, I would say: just change one parameter and the machine doesn't know how to deal with the different emotional situation, while a human being has the ability to generate a real emotional response and adapt to the new situation.

You know, guys, that is what humans are all about. Our brain is so much more complex than any supercomputer in the world, we are not machines, and we don't think in zeros and ones; our imperfections and the ability to have a real emotional response set us apart from a machine. And that is the whole point for me.


----------



## Puzzlefactory (Jul 6, 2017)

Maybe not completely independently. But add enough human-editable variables to the program and I'm sure you could have an "add cinematic music" function in Final Cut or Premiere. The video editor could then just add music him/herself, without having to go through libraries or composers.
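That "editable variables" idea needs surprisingly few moving parts: a preset table plus a resolver that turns editor-facing controls into generation parameters. A minimal sketch, where every name (the presets, the parameters, the scaling rule) is hypothetical:

```python
# Hypothetical style presets an editor plugin might expose
PRESETS = {
    "cinematic": {"tempo": 70,  "scale": "minor",
                  "layers": ["strings", "brass", "perc"]},
    "corporate": {"tempo": 110, "scale": "major",
                  "layers": ["piano", "pads", "claps"]},
}

def render_settings(style, intensity=0.5, duration_s=30):
    """Resolve user-facing controls into parameters a generator could consume."""
    base = dict(PRESETS[style])
    # intensity decides how many instrument layers are active (at least one)
    n = max(1, round(intensity * len(base["layers"])))
    base["layers"] = base["layers"][:n]
    base["duration_s"] = duration_s
    return base
```

A video editor dragging an "intensity" slider and picking "cinematic" from a dropdown would, under the hood, just be calling something like `render_settings("cinematic", intensity=0.7, duration_s=22)` and handing the result to the music engine.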


----------



## Eric G (Jul 6, 2017)

Puzzlefactory said:


> Maybe not completely independently. But add enough human-editable variables to the program and I'm sure you could have an "add cinematic music" function in Final Cut or Premiere. The video editor could then just add music him/herself, without having to go through libraries or composers.



You called it. Go to the bottom of https://www.ampermusic.com/

There is a Premiere plugin that does just that, called the Adobe® Premiere Pro® Amper Panel.

I'm not saying that Amper is good enough at all, but $5M in funding means they are going to be around for a while trying to make it happen.


----------



## Eric G (Jul 6, 2017)

Eric G said:


> You called it. Go to the bottom of https://www.ampermusic.com/
> 
> There is a Premiere plugin that does just that, called the Adobe® Premiere Pro® Amper Panel.
> 
> I'm not saying that Amper is good enough at all, but $5M in funding means they are going to be around for a while trying to make it happen.



From the download page:
Instantly create the perfect soundtrack to your video in a fraction of the time. No more searching lists of stock music. Your music is uniquely crafted with no risk of it being used by someone else.

Amper provides you with a global, perpetual use, and royalty free license, with no conditional or unexpected financial expenses. Built by Hollywood composers, sound designers, and leading music technologists Amper composes and performs each note using professionally played and recorded samples. No loops or pre-composed musical segments.


----------



## Johann F. (Jul 6, 2017)

Eric G said:


> You called it. Go to the bottom of https://www.ampermusic.com/
> 
> There is a Premiere plugin that does just that, called the Adobe® Premiere Pro® Amper Panel.
> 
> I'm not saying that Amper is good enough at all, but $5M in funding means they are going to be around for a while trying to make it happen.



HA now that's funny!

I found Amper to be as boring and generic as the next toyish Casio arranger keyboard.

Orb sounds way more serious. The MIDI thing is a nice feature. I work on some really bad soap operas from time to time and who knows, maybe I'll let Orb do all the labor while I enjoy the beach. I bet producers won't even notice the difference - and that speaks volumes about my work HA!


----------



## Eric G (Jul 6, 2017)

Johann F. said:


> HA now that's funny!
> 
> I found Amper to be as boring and generic as the next toyish Casio arranger keyboard.
> 
> Orb sounds way more serious. The MIDI thing is a nice feature. I work on some really bad soap operas from time to time and who knows, maybe I'll let Orb do all the labor while I enjoy the beach. I bet producers won't even notice the difference - and that speaks volumes about my work HA!



LOL. Agreed about Amper. Not impressed. We will have to see about Orb.


----------



## bigcat1969 (Jul 6, 2017)

Forgive me for going a bit OT. Based on folks' earlier comments about Synfire, I'm trying the demo for Harmony Navigator, which is the cheaper, non-iLok songwriter helper from Cognitone. I am 112% lost. I can't get any sound and I don't know what I'm doing. Well, of course, that isn't that unusual...


----------



## DSmolken (Jul 6, 2017)

Polkasound said:


> As AI advances, its influence will expand. It will push more and more composers out of the business, and the ones who will survive will be those who are talented and creative enough to embellish upon the AI's creation while still remaining competitive.


Also a few with the knowledge of music theory and programming to work on the algorithms.


----------



## Puzzlefactory (Jul 6, 2017)

A new type of musician will probably evolve that's adept at manipulating the algorithms. 

And just as dance producers and DJs were scorned by acoustic musicians, so too will this new breed be told "you're not really musicians, the machine's doing it all"...


----------



## Alatar (Jul 6, 2017)

Eric G said:


> Over the last 12 months, I have noticed several major technology developments being made in artificial intelligence directed at the music industry. Particularly with music composition.
> 
> AmperMusic
> https://www.ampermusic.com/
> ...



There is at least one more AI-music startup. This one is London-based:
https://www.jukedeck.com/
https://soundcloud.com/jukedeck


----------



## hummingbird (Jul 6, 2017)

It's all about bottom line. Right now there is a market for production music for television, cable, advertising, etc. Imagine an advertising company who only needs an AI composer to put together its background music. No salary to pay, no sync to pay, no cue sheets to file. They own the machine and they own what it writes. We may think it's not good enough, but if an AI can create the cheery tune that underscores the paper towel commercial, that's a composer out of work. Multiply that by HBO, cable, reality tv, and I can envision the market for 'real music' getting smaller and smaller. One day. Maybe. The sky is not falling yet.

I find it interesting to contemplate that an industry that consistently and urgently insists that virtual instruments be of such quality, and be used in such a meticulous way, as to be indistinguishable from human-played instruments would then accept compositions from a machine and deem them worthy.


----------



## Eric G (Jul 6, 2017)

Alatar said:


> There is at least one more AI-music startup. This one is London-based:
> https://www.jukedeck.com/
> https://soundcloud.com/jukedeck



Yeah. I didn't include them because they have no product in public beta and no announcement of when there will be one.


----------



## Rohann (Jul 6, 2017)

What I can never grasp is why tech developers seem incapable of philosophical thought. Musk is one of the few who seem genuinely motivated by a desire to advance humanity rather than simply make money.

I'll never understand the obsession with making AI replicate unique and beautiful human expression. Even if AI somehow does end up being able to create convincingly generic music (i.e. a generic paper towel jingle, no offense to jingle creators), or write movie scripts and create digital films, etc, I think eventually the novelty will wear out and people will stop caring.
Who cares if a machine lifts 500lbs? We still care when a human does it. In fact, we televise competitions about it.


----------



## X-Bassist (Jul 6, 2017)

AlexRuger said:


> Witness the long line of bad Hollywood films that sell like hotcakes.
> 
> It's not a matter of making music that's good. It's a matter of making music that is good _enough, _which is what I'm afraid of.



I hear you. Like I said, if it's just about the money, there are easier ways to make money. There are still producers and viewers who know the difference between John Williams and generic trailer music. Both make money, but one is an art and the other just a monetary transaction. If you're fine with just the latter, then so be it. But you won't be on your deathbed saying "I just wish I made more money!" Affecting people emotionally and having true fans of your art is much more than just money, and much more satisfying. That's why Williams worked so hard at it.


----------



## Rasmus Hartvig (Jul 6, 2017)

Most modern blockbuster soundtracks already sound like they were written by a machine, so if we just lower the standards a little bit more, the AI will be fine. Brian Tyler might have to buy a smaller house though...


----------



## Replicant (Jul 6, 2017)

AlexanderSchiborr said:


> But a computer isn't able to understand an "emotional reaction" like humans do, imo. Sure, it can follow the rules humans programmed it with, or react somehow, but that doesn't really mean it understands the drama and feels it; its algorithm, programmed by humans, tells it what to do. But here comes the problem: while an algorithm can maybe cover several situations, change just one variable and the computer ends up in an error. It is just impossible to inject this into it, because enjoyment, fear, love, etc. are irrational, and exactly that is foreign to the inherent nature of a computer. And yes, I did listen to some music generated by AI. And while some of the examples are not bad at all, to put it mildly, I would say: just change one parameter and the machine doesn't know how to deal with the different emotional situation, whereas a human being has the ability to generate a real emotional response to adapt to the new situation. You know, guys: that is what humans are all about. Our brain is so much more complex than any supercomputer in the world; we are not machines, we don't think in zeros and ones, and our imperfections and our ability to have a real emotional response set us apart from a machine. And that is the whole point for me.



This whole discussion reminds me of how Vangelis sees music more as science than art and I tend to agree.

We are machines, just organic ones.

We often associate something like the Lydian mode with flying, romance or happiness. It is inevitable that an AI will be able to figure this out based on pieces written with that mode, their titles and use in film. So if you tell it "compose music that makes me feel like I'm flying", it would give you that — it's essentially associating tags with composition techniques just like a director explaining what they want to you.

Lastly, a melody or piece doesn't require "emotion", which is difficult to objectively measure anyway, to be good. Like everything else in music that sounds good, there is a logical reason why it sounds that way and it can be explained with theory and broken down into an algorithm.

And if there is one thing AI and machines are really good at, it's logical algorithms.


----------



## Rohann (Jul 6, 2017)

X-Bassist said:


> I hear you. Like I said, if it's just about the money, there are easier ways to make money. There are still producers and viewers who know the difference between John Williams and generic trailer music. Both make money, but one is an art and the other just a monetary transaction. If you're fine with just the latter, then so be it. But you won't be on your deathbed saying "I just wish I made more money!" Affecting people emotionally and having true fans of your art is much more than just money, and much more satisfying. That's why Williams worked so hard at it.


Certainly true. I just hope the mainstream eventually appreciates music and art more than they do now, rather than less. We're certainly in the age of CG and effects coming before story and characters, though I'm glad to see how well films like Arrival and Logan did.


----------



## Johann F. (Jul 6, 2017)

Rohann said:


> I'll never understand the obsession with making AI replicate unique and beautiful human expression. Even if AI somehow does end up being able to create convincingly generic music (i.e. a generic paper towel jingle, no offense to jingle creators), or write movie scripts and create digital films, etc, I think eventually the novelty will wear out and people will stop caring.
> Who cares if a machine lifts 500lbs? We still care when a human does it. In fact, we televise competitions about it.



Dude that was awesome. Hooray for humans!


----------



## Rohann (Jul 6, 2017)

Replicant said:


> This whole discussion reminds me of how Vangelis sees music more as science than art and I tend to agree.
> 
> *We are machines, just organic ones.*


Ooh, that's quite the statement. There are plenty of significant philosophers and neuroscientists who would disagree with a mechanistic/deterministic view of the human mind. I'd suggest Edward Feser's introduction to the philosophy of mind on that one.

Also, separating science from art is, I think, often unnecessary. There's plenty of art to be found in science (consider the frequent misuse of the word, and the complexity of interpreting and generalizing data, for instance), and vice versa.
In music there are of course systems and components in place (scales, notes at specific pitches, etc.), just as there are in painting (colours, brushes, paper types, paints of varying viscosity). The end result is greater than the sum of the parts, however. What makes a painting special isn't the mechanisms themselves; it's the meaning and intent, in combination with the result of those mechanisms.



> We often associate something like the Lydian mode with flying, romance or happiness. It is inevitable that an AI will be able to figure this out based on pieces written with that mode, their titles and use in film. So if you tell it "compose music that makes me feel like I'm flying", it would give you that — it's essentially associating tags with composition techniques just like a director explaining what they want to you.
> 
> Lastly, a melody or piece doesn't require "emotion", which is difficult to objectively measure anyway, to be good. Like everything else in music that sounds good, there is a logical reason why it sounds that way and it can be explained with theory and broken down into an algorithm.
> 
> and if there is one thing AI and machines are really good at — it's logical algorithms.


But emotion drives performance, melody creation etc. It's hard to ignore that, at least from an artistic standpoint, many people don't care about something not written or created by another person. There's nothing to personally connect with.
The only truly meaningful objective measurement to the efficacy of a piece is how it affects people emotionally.


----------



## X-Bassist (Jul 6, 2017)

Rasmus Hartvig said:


> Most modern blockbuster soundtracks already sound like they were written by a machine, so if we just lower the standards a little bit more, the AI will be fine. Brian Tyler might have to buy a smaller house though...



Working in Hollywood for the last 25 years, I have seen many executives fall for the idea that "the music and sound design don't matter that much." But these are also the same people who later complain that there is not enough support for films, and that people don't watch (and rewatch) like they used to. The less support the story has (and hopefully the story is worthwhile), the less engaged viewers will be and the less they will want to support the film.

I've worked on films that were slow and painful to watch with only dialog, but when great themes and great sound design were mixed in, they became something much more. As generic music plays out (happening now) and more producers realize how much a thoughtful score helps (compared to the time and expense of reshoots and visual fx), they will also realize their picture editors (with or without a generated music library) don't have the time or expertise to make that kind of connection happen. Library music has been around for a long time, but there are still very few films without a hard-working composer. Why composers are writing more generically is another question altogether (perhaps to throw the analysis apps off the mark?)


----------



## Fer (Jul 6, 2017)

Totally agree with you on this @AlexanderSchiborr .

Today's chess software can beat Magnus Carlsen without problems; it plays moves that are simply not human and that are based on raw calculating power. I read an article recently explaining how these algorithms work. Chess software is "making decisions" between different moves in a mechanical way: the software can tag every possible move with a number. It will choose move A over move B because:

- the value of A is 5 and the value of B is 2;
- 5 is more than 2;
- and it is designed to choose the highest value.

The engineer has intelligently established a set of rules that assign the higher number to the stronger move. But the software doesn't know what a number is, or that five is more than two, or that the queen is much more important than the pawn; it is the engineer who knows it. If he defines that 2 is more than 5, the software will assume it without complaint and will lose against Carlsen in an instant. So in the end there is only a dead machine working as expected.
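That tag-every-move-with-a-number idea can be sketched in a few lines. The piece values, function names, and the score-by-capture rule below are all invented for illustration; real engines also search many moves ahead (minimax/alpha-beta), but the "choose the highest number" core is the same:

```python
# Hypothetical piece values assigned by the programmer. The program never
# "knows" what a queen is; it only compares the numbers it was given.
PIECE_VALUES = {"pawn": 1, "knight": 3, "bishop": 3, "rook": 5, "queen": 9}

def score_move(captured_piece):
    """Score a move by the value of the piece it captures (0 if none)."""
    return PIECE_VALUES.get(captured_piece, 0)

def choose_move(candidate_moves):
    """Pick the move with the highest numeric score.

    candidate_moves: list of (move_name, captured_piece_or_None) pairs.
    Flip the values in PIECE_VALUES and this chooser will happily blunder,
    exactly as described above: the meaning lives in the engineer's head.
    """
    return max(candidate_moves, key=lambda m: score_move(m[1]))

moves = [("a", "pawn"), ("b", "queen"), ("c", None)]
print(choose_move(moves))  # ('b', 'queen')
```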

But music is, imo, something different. Creating music is about taking real decisions during the process, decisions that involve your consciousness, your emotional perceptions, your knowledge, your goals... A piece of software making music is forced to take "decisions" too. But how? I can't imagine software perceiving by itself that "this is more beautiful than that." But I can imagine software with defined instructions like: "if a sudden expectation is needed, go from the dominant chord to the bVI chord instead of to the tonic." I think that in the future this could perhaps work for making correct/generic music, which could eventually replace some composers' jobs. But not for making something entirely beyond the software's defined rules.

Another thing: if anyone out there could design a mechanical way to measure the beauty of a melody that could eventually be encoded into an algorithm... guess what? Those people are the musicians, and this site is full of them. And we know there are no mechanical rules that, once applied, will automatically produce a beautiful melody.


----------



## VinRice (Jul 6, 2017)

Music is maths. Conceptually there is no reason whatsoever why AI couldn't be extremely useful for the 'grunt' work that I am sure everybody here does every day: chord sequences, harmonisation, meter, timing, etc. However, the more you learn about music, the more you realise the infinite variations that are possible as those little musical decisions mount up. I'm very happy for something like Orb to resolve a chord sequence for me at the right tempo to hit 01.04.33. There is no way in my lifetime that it is going to know to add that little triplet stutter in the melody that makes the whole thing come alive. There is no way that it is going to know that the director is obsessed with cimbalom and how to construct complementary instrumentation. There is no way it is going to know the latest underground dance floor meme that Michael Bay will completely lose his sh*t over.

Will there be automated soundtracks for dross TV shows and YouTube channels? Of course! Technology will always remove the jobs where there is a financial incentive to favour output over quality. Who wants those jobs? If you treat music like a production line (yes I know, I know) technology will eventually 'productionize' it. Just like every other industry in the world.

Change is inevitable and it's pointless bitching about it. Embrace it, stay ahead of the curve and use it to your advantage. I really hope something like Orb works, so that the distance between the musical ideas in my head and finally fitting them to a cue is reduced. I still think it is some years away, however. The demos on the Orb site are absolute garbage.
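Incidentally, that "right tempo to hit 01.04.33" grunt work really is just arithmetic, which is why it's such a good candidate for automation. A sketch with invented numbers, assuming we want 64 elapsed beats (bar 17, beat 1 in 4/4) to land 64.33 seconds into the cue:

```python
def tempo_for_hit(beats, seconds):
    """Return the BPM that places `beats` elapsed beats at `seconds`.

    BPM = beats per minute, so tempo = beats * 60 / seconds.
    """
    return beats * 60.0 / seconds

# Bar 17 beat 1 in 4/4 means 16 full bars = 64 beats have elapsed.
bpm = tempo_for_hit(64, 64.33)
print(round(bpm, 2))  # 59.69
```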


----------



## DSmolken (Jul 6, 2017)

VinRice said:


> There is no way it is going to know the latest underground dance floor meme that Michael Bay will completely lose his sh*t over.


Ha, as an occasional DJ I've sometimes stood there picking the next track thinking that this is something software will soon be a lot better at than us humans - knowing the latest trends before they get big, and predicting what track the specific audience that specific evening will react positively to. That's just something that's pretty hard for us, and much easier for computers.


----------



## Rohann (Jul 6, 2017)

Fer said:


> Totally agree with you on this @AlexanderSchiborr .
> 
> Today's chess software can beat Magnus Carlsen without problems; it plays moves that are simply not human and that are based on raw calculating power. I read an article recently explaining how these algorithms work. Chess software is "making decisions" between different moves in a mechanical way: the software can tag every possible move with a number. It will choose move A over move B because:
> 
> ...


Well said. I'm reminded of Searle's "Chinese Room" argument re: the inability of AI to actually understand rather than act as input/output.


----------



## VinRice (Jul 6, 2017)

DSmolken said:


> knowing the latest trends before they get big, and predicting what track the specific audience that specific evening will react positively to. That's just something that's pretty hard for us, and much easier for computers.



Really? You think a computer will be able to pick up 'the vibe' in a room before an experienced human? I think that's unlikely. (eight years at the coal face in London's nightclubs)


----------



## Rohann (Jul 6, 2017)

VinRice said:


> Music is maths. Conceptually there is no reason whatsoever why AI couldn't be extremely useful for the 'grunt' work that I am sure everybody here does every day: chord sequences, harmonisation, meter, timing, etc. However, the more you learn about music, the more you realise the infinite variations that are possible as those little musical decisions mount up. I'm very happy for something like Orb to resolve a chord sequence for me at the right tempo to hit 01.04.33. There is no way in my lifetime that it is going to know to add that little triplet stutter in the melody that makes the whole thing come alive. There is no way that it is going to know that the director is obsessed with cimbalom and how to construct complementary instrumentation. There is no way it is going to know the latest underground dance floor meme that Michael Bay will completely lose his sh*t over.
> 
> Will there be automated soundtracks for dross TV shows and YouTube channels? Of course! Technology will always remove the jobs where there is a financial incentive to favour output over quality. Who wants those jobs? If you treat music like a production line (yes I know, I know) technology will eventually 'productionize' it. Just like every other industry in the world.
> 
> Change is inevitable and it's pointless bitching about it. Embrace it, stay ahead of the curve and use it to your advantage. I really hope something like Orb works, so that the distance between the musical ideas in my head and finally fitting them to a cue is reduced. I still think it is some years away, however. The demos on the Orb site are absolute garbage.


Good points. The key is using it in an innovative way and not letting convenience take the forefront of the actual writing.

As an aside, despite my own desire to be able to make convincing music without musicians, my end goal would always be to record as much playing as I'm able. I don't think I'm unique in this regard.



VinRice said:


> Really? You think a computer will be able to pick up 'the vibe' in a room before an experienced human? I think that's unlikely. (eight years at the coal face in London's nightclubs)


Indeed, I think this type of thinking is a deification of AI that reveals a lack of understanding about AI's capabilities. One can dream, sure, but predict?


----------



## VinRice (Jul 6, 2017)

VinRice said:


> Really? You think a computer will be able to pick up 'the vibe' in a room before an experienced human? I think that's unlikely. (eight years at the coal face in London's nightclubs)


 (Replying to my own post - I'm clearly insane.)

Having said that - in the far-future I can imagine everybody in a nightclub being tagged, having their blood chemistry analysed, their personal computing implants interrogated and the aggregate 'vibe' being analysed and compared to historical data to provide a list of sound track options for the operator depending on the 'mood-direction' required. Of course the only nightclubs left by that time will be in the badlands of the Jupiter mining colonies. 

I'll stop now...


----------



## VinRice (Jul 6, 2017)

Rohann said:


> As an aside, despite my own desire to be able to make convincing music without musicians, my end goal would always be to record as much playing as I'm able. I don't think I'm unique in this regard.



Couldn't agree more. We react in such deep ways to actual musicianship; I don't think that will be going away any time soon.


----------



## Rohann (Jul 6, 2017)

VinRice said:


> Couldn't agree more. We react in such deep ways to actual musicianship; I don't think that will be going away any time soon.


Indeed!



VinRice said:


> (Replying to my own post - I'm clearly insane.)
> 
> Having said that - in the far-future I can imagine everybody in a nightclub being tagged, having their blood chemistry analysed, their personal computing implants interrogated and the aggregate 'vibe' being analysed and compared to historical data to provide a list of sound track options for the operator depending on the 'mood-direction' required. Of course the only nightclubs left by that time will be in the badlands of the Jupiter mining colonies.
> 
> I'll stop now...


Of course, the implants will have been implicated in the development of rapidly advancing neurodegenerative disorders, but UNITECH will deny any such connection.


----------



## VinRice (Jul 6, 2017)

Most historians agree that this was the inciting incident for the great Jupiter uprisings and its eventual independence.


----------



## VinRice (Jul 6, 2017)

..and the eventual outlawing of invasive brain tech in most colonies...


----------



## Uncle Peter (Jul 6, 2017)

The savvy music library is going to copy Amper Music and the like straightaway. There is definitely a place for this alongside the human compositions.

... once the compositions are better, that is. They sound like crud at the moment.


----------



## ChristianM (Jul 6, 2017)

Do not worry about AI: the path that was chosen (40 years ago) is the one that imitates, not the one that invents.
That path was chosen because it gives quick results, but it is not really intelligence.

To illustrate this:
"Victor tells a funny story to a monkey; the monkey understands nothing and does not react.
Victor's brother enters the room, and Victor tells him the same funny story. The brother bursts out laughing. The monkey imitates the brother and also laughs..."

The AI as imagined by John McCarthy will probably never see the light of day...

But in music, harmony, counterpoint, etc. are achievable, because they are "techniques"...


----------



## Replicant (Jul 6, 2017)

Rohann said:


> But emotion drives performance, melody creation etc. It's hard to ignore that, at least from an artistic standpoint, many people don't care about something not written or created by another person. There's nothing to personally connect with.
> The only truly meaningful objective measurement to the efficacy of a piece is how it affects people emotionally.



There are two big, incorrect assumptions people always make regarding this topic.

1) The idea that an AI could not produce pieces that create subjective feelings (emotion) in humans because the machine itself lacks that capability. But the emotion you describe is not an objective property of the music.

2) The idea that people actually care about whether the "art" they enjoy is created by man or machine, rather than caring about the quality of the piece and its appeal to the audience.

To the first one, I would argue that emotion doesn't really drive melody creation. That's no doubt a controversial statement on a forum like this, but it's the truth. A melody is a linear succession of tones, and every pleasing tune ever composed consists of tones following particular rules of motion (and subversions of those expectations), repetition and variance, and chord and non-chord tones; these rules are being followed whether the composer is consciously aware of it or not.

This is absolutely something an AI could learn and would be good at; all it needs to do is be fed these rules and examine tons of pieces, which it could do in seconds.

It's just a matter of time before an AI can master all of these concepts and generate material on its own. There is already notation software that can spot errors in voice leading. Apply that as an autocorrect feature, add the basic "rules" of melody and harmony, have the AI compose multiple melodies at once, and boom: counterpoint.
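The rule-following view of melody sketched above is easy to caricature in code. Here is a toy generator over the C major scale with two invented "rules of motion" (prefer steps; a leap must resolve by step in the opposite direction). The scale, rules, and names are mine for illustration, not any product's actual algorithm:

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale, MIDI note numbers

def next_degree(prev, before, n=len(SCALE)):
    """Choose the next scale degree under two simple rules of motion."""
    if before is not None and abs(prev - before) > 2:
        # Rule 2: after a leap, step back in the opposite direction.
        step = -1 if prev > before else 1
        return max(0, min(n - 1, prev + step))
    # Rule 1: mostly move by step, occasionally leap a fourth.
    move = random.choice([-1, -1, 1, 1, 1, -3, 3])
    return max(0, min(n - 1, prev + move))

def generate_melody(length=8, seed=0):
    """Generate a deterministic toy melody starting on the tonic."""
    random.seed(seed)
    degrees = [0]
    before = None
    for _ in range(length - 1):
        degrees.append(next_degree(degrees[-1], before))
        before = degrees[-2]
    return [SCALE[d] for d in degrees]

print(generate_melody())
```

Trivial, of course; the point is only that "rules of motion plus subversion" is directly expressible as an algorithm, and a learned system would infer far richer rules from real pieces.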

As for point two, while it certainly has its detractors for being "computer music", are we all just going to pretend EDM, trailer music, etc. doesn't have a positively massive fanbase? 99% of trance music is exactly the same, and an AI should have no trouble creating it if someone were truly up to the task of building one. The number of people who are bothered by sampled music or entirely synthetic pieces is statistically insignificant.

Nobody cares whether the music they're listening to on Spotify or in the theater is made with samples or synthesizers, and they're not going to care if it was composed by a computer either; they care that they like it.

John Williams-level compositions at the click of a button are coming in the next decades whether we like it or not. 

Art is not safe from human laziness allowing humans to be replaced entirely.


----------



## DSmolken (Jul 6, 2017)

VinRice said:


> Really? You think a computer will be able to pick up 'the vibe' in a room before an experienced human? I think that's unlikely. (eight years at the coal face in London's nightclubs)


Really. I just got to thinking about that one night, what am I really doing and how I could write requirements for software that could do the job better. It would not be trivial, but...

The number of people on the dance floor and their energy level (plus stuff like age, sex, etc.) is pretty easy data to acquire, though things like how fashionably they're dressed are probably a bit harder, and I know that also goes into my thinking. Sure, I can still do that part better, even though a machine would have an easier time getting current data on drink sales, which is where the actual money is. But for just the picking-up-the-vibe-of-the-room part, let's say I'm probably still better overall than any currently feasible software I can imagine.

But that's one room, which is a big limitation. A machine could access historical data, let's say up to last night, or even up to the minute, from this club and clubs all across the world, find clubs with similar profiles, and make a list of playlist suggestions. That I can't do and will never be able to do - the closest I can do is remember what worked at the same club earlier, talk to other DJs who played there, and check charts. That's not even close.

The last step is to put the first two together and mix tracks from those suggestions based on what's currently happening in the room, which, well, I'm pretty good at doing that on the fly, but again, something with data from clubs all over the world could do better.
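The profile-and-compare step is the part machines are plainly built for. A toy sketch of the idea: describe tonight's room as a feature vector, rank historical club nights by similarity, and pool the tracks that worked on the most similar nights. The data, field choices, and cosine-similarity metric are all invented for illustration:

```python
import math

def similarity(a, b):
    """Cosine similarity between two room-profile vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# (crowd_size, avg_age, energy 0-10) per historical night, with the
# tracks that filled the floor that night. Entirely made-up data.
history = [
    ((180, 24, 8), ["track_a", "track_b"]),
    ((60, 35, 4), ["track_c"]),
    ((200, 23, 9), ["track_b", "track_d"]),
]

def suggest(tonight, top_n=2):
    """Pool tracks from the top_n most similar historical nights."""
    ranked = sorted(history, key=lambda h: similarity(tonight, h[0]), reverse=True)
    tracks = []
    for _, night_tracks in ranked[:top_n]:
        for t in night_tracks:
            if t not in tracks:
                tracks.append(t)
    return tracks

print(suggest((190, 22, 9)))  # ['track_b', 'track_d', 'track_a']
```

A real system would use far richer features and data from many venues, as described above, but the ranking core would look much like this.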

But my day job is administering tools used by software developers. So I tend to think about this kind of stuff.


----------



## Puzzlefactory (Jul 6, 2017)

The whole system of automation will decimate the job market all over the place (as I've mentioned in another thread).

A landlord of a pub will stop hiring a DJ because of a program that'll pump out original dance music all night. He'll be happy with the money he's saved until he gets an email from the brewery telling him they don't need him any more, because they'll be installing self-service drinks dispensers at the bar and an automated barrel-changing system in the cellar.

All the managers at the brewery will be pleased with themselves over the savings they've made for the company, until the shareholders contact them to say that their services are no longer needed: there's a computer algorithm that can make middle-management decisions with more efficiency and a lower margin of error than any human.

While they're cleaning out their offices, the warehouse staff are being informed that they're being replaced by robotic picking/packing machines, and the truck drivers are being let go in favour of driverless lorries.

Obviously all this won't happen overnight, but whole industries will be affected simultaneously and there's nothing really to fill the gap.

It's a real problem that's not given enough serious thought IMO. Music composition is just one small part of it...


----------



## Rohann (Jul 6, 2017)

Replicant said:


> There are two big, incorrect assumptions people always make regarding this topic.
> 
> 1) The idea that an AI could not replicate pieces that create subjective feelings (Emotion) in humans because the machine itself lacks this capability, but the emotion you describe is not an objective experience of music.
> 
> ...


Sure. The mechanisms of a melody are largely rule-based, and what tends to be memorable and pleasing to the human ear can be said to follow certain predictable patterns. What I mean by emotion-driven is the idea of "what is the colour red" vs "what is it _like_ to see the colour red" -- the context of the melody may be lost on many, but it's certainly not unimportant.



> As for point two, while it certainly has its detractors for being "computer music", are we all just going to pretend EDM, trailer music, etc. doesn't have a positively massive fanbase? 99% of trance music is exactly the same and an AI should have no trouble creating if someone was truly up the task of making an AI that could do it. The number of people who are bothered by sampled music or entirely synthetic pieces are statistically insignificant.


True, but despite it having a fan base, I would hardly say any of it is worthy of the label of "artistically significant", or even interesting. The manipulation and context is what makes Amon Tobin's music considerably more interesting than random samples thrown together or the run-of-the-mill mashup DJ.



> Nobody cares if the music they're listening to on spotify or in the theater is made with samples or synthesizers and they're not going to care if it was composed by a computer too — they care that they like it.


You make some good points, but I disagree with this. As for sound source, I think it's less important than what it sounds like to a listener. But as for _nobody_ caring whether or not it was made by a person? Why will someone pay $6000 for a hand-made guitar when they can buy a production model of similar or better build quality, often with fewer small flaws, for significantly less? Why do we care about who can drive the fastest and most accurately, or lift the most weight, or shoot the straightest, or problem-solve the quickest, when machines have been able to do all of these things more competently than us for years? Why do people still attend live music events? (Arguing that more people spend time in clubs is a ridiculous point [not one I'm saying you're making, just in case it comes up]; a small fraction of the people in clubs even notice what music is playing, beyond the deafening noise and a consistent, danceable underlying rhythm.) Why do people still care about painted vs digital art? Why do people still care about who the best chess player is, despite AI having beaten world-renowned players? Why do a significant number of people care about live improvisation in music?

While I agree that AI can brute-force-analyze music at ridiculous speeds, and will (probably) be able to construct complex melodies and harmonies in the future, I can't help but think there will still be a significant place for human expression. There's absolutely a market for generic garbage, especially in music, but believing that meaning is unimportant to humans is either a bias resulting from metaphysically assuming humans are purely mechanistic (an underlying assumption of this view being that humans connect to nothing beyond deterministic neurochemistry), or a pessimistic view of human laziness (something I'm afraid of, and unfortunately suspect, myself). And this isn't even touching the range of expression and interesting rule-breaking that happens in music, which is much more difficult to replicate.

I also suspect many of these tech "innovations" are _far_ more difficult to implement seamlessly and consistently in real life. Consider even human language and how inaccurate translation AI is (at least what's available to the public): it works fine for short phrases, but quickly loses context and nuance. Even videogame AI is nowhere near where people would expect it to be by now, despite processing power no longer being much of a hindrance. I think writing interesting music will be a far more difficult task than the brute-force response analysis that happens in, say, computer chess.



> Art is not safe from human laziness allowing humans to be replaced entirely.


This is something I personally dread. The irony is that the negative research always comes out after the invention (e.g. smartphones, and the research that's since emerged on overall quality of life, neurological development in children, and severe reductions in attention span and ability to focus in adults).

But anyway, you're a replicant, so why would I listen to you? Do you dream of electric sheep?


----------



## Rohann (Jul 6, 2017)

DSmolken said:


> Really. I just got to thinking about that one night, what am I really doing and how I could write requirements for software that could do the job better. It would not be trivial, but...
> 
> The number of people on the dance floor and their energy level (plus stuff like age, sex etc.) is pretty easy data to acquire, though things like how fashionably they're dressed is probably a bit harder, and I know that also goes into my thinking. Sure, I can still do that part better, even though a machine would also have an easier time getting current data on drink sales, where the actual money is. But, let's say, for just the picking up the vibe of the room part, I'm probably still better overall than any currently feasible software I can imagine.
> 
> ...


OK, but to be fair: what, _really_, is the "vibe" range in a dance club? I've spent incredibly little time in clubs, and I'm sure I could "analyze" the vibe in most nightclubs without going into them.


Puzzlefactory said:


> The whole system of automation will decimate the job market all over the place (as I've mentioned in another thread).
> 
> A landlord of a pub will stop hiring a DJ because of a program that'll pump out original dance music all night. He'll be happy with the money he's saved until he gets an email from the brewery telling him they don't need him any more, because they'll be installing self-service drinks dispensers at the bar and an automated barrel-changing system in the cellar.
> 
> ...


It really isn't given enough thought, because for some reason people view technological invention as a moral imperative taking precedence over the good of mankind. I really think Elon Musk is one of the very few tech geniuses who understands this. Even Jobs didn't allow his own products into his home -- what kind of person creates a household product they're afraid to let their family use?


----------



## Replicant (Jul 6, 2017)

Puzzlefactory said:


> The whole system of automation will decimate the job market all over the place (as I've mentioned in another thread).
> 
> A landlord of a pub will stop hiring a DJ because of a program that'll pump out original dance music all night. He'll be happy with the money he's saved until he gets an email from the brewery telling him they don't need him any more, because they'll be installing self-service drinks dispensers at the bar and an automated barrel-changing system in the cellar.
> 
> ...



No matter how you slice it, this leads to a communist dystopia for us.

"We'll just have a universal basic income!"

Yeah, well you can only redistribute wealth so long as there is wealth being generated to distribute, which you don't have under such a system; it also discourages entrepreneurship and renders it impossible anyway.

Civilization is all about work; without it we have no civilization. It's the work of building things and generating an economy, the work of maintaining those buildings, economies, and innovations, and our day-to-day interactions within that system that make civilization what it is and separate us from the animals.

The only hope I can see is that CEOs and corporations will realize that boosted productivity and automation don't matter if no one has any money to buy your products, because they have no work and their UBI is drained by utility companies that now charge the most they possibly can when everyone is on a fixed income.

This happened at my local Wal-Marts. They used to have automated checkouts, but they got rid of them because the checkouts killed jobs that gave people the money to buy from them and, like I said in my first point, were actually no more "convenient" than an employee.

We can still efficiently produce most things we consume without fully automating them or having them run by HAL-9000s. 

If it comes down to having my Amazon order occasionally show up late or the kid at Wendy's screwing up my order from time to time vs. living in a dystopia of "convenience", I think I'll take that extra 10-minute wait in line, thanks.


----------



## jononotbono (Jul 6, 2017)

I think it's pretty obvious that when people use emulations of musical instruments, no matter how advanced they are and how fantastic they have started to sound, they are nothing like the real deal. Why do people think the human being can just be emulated? It's ridiculous. Music isn't maths. Music is emotion and no computer or algorithm feels emotion. People always want something new and exciting and the only thing capable of that is human chaos (the human being). You can't program that.


----------



## AlexRuger (Jul 6, 2017)

This discussion is overall missing the point, I think...

First, stop bringing up art. These technologies are all about disrupting _industries_, i.e. by definition they are focused on replacing _working_ composers. Art has absolutely nothing to do with this discussion, because it's not like people will stop writing music for fun/enjoyment/art just because an algorithm can get paid to do it. I should be clear, though, that by "art" I mean the intention of the music, i.e. it is not being made for any specific purpose beyond the artist's own (whatever that may be) and is not being commissioned. We're talking about music that is made in exchange for money; that's the only thing these technologies would have any reason to target.

Second, AI is a bit of a misnomer here. Obviously whatever capabilities the algorithm has will be decided by a human being. Humans are the ones writing the code, after all, and machine learning is still quite young. This is far better classified as "automation" as opposed to "artificial intelligence." I suppose if you want to get all semantics-y, this is technically "AI," but...yeah, at its core, it's automation.

As ChristianM said, this technology imitates. If it were inventing, then it'd truly be AI--machine learning--which is of course the goal and, if it's possible at the desired level, will inevitably come to pass. But at the moment, this technology's biggest strength is imitation. And that is _hilariously_ easy for an algorithm to do. You all should know, because you already do it. It's called "learning your craft." It's literally what makes a professional musician a professional--we understand _what_ musical parameters to tweak in order to gain the desired emotional response from the audience. It's a little cold to define it that way, but it's exactly what we do. Otherwise, none of us would be capable of composing music for dramatic projects. Doing it by "feel" just means your brain is wired in such a way that you can do it without explicitly thinking about it. Sort of like how loads of people have to study their asses off in order to improvise in a jazz setting, but for some people like Jeff Beck it's a natural thing, and they already understand it. Their brains just already had that firmware installed!

I mean, take for instance orchestration. It's a deep and life-long art, but you can automate the meat of the process to where you have something "good enough" by having instruments with like ranges playing the same part. That'd take four seconds to code, and the vast majority of the people in the world wouldn't notice, especially considering that the vast majority of orchestrated music they hear is probably Beethoven and Mozart, and for all their genius, their orchestration was pretty tame and "vanilla" by later standards. So the audience would just hear flute doubling violin, french horn doubling viola or cello, etc, and they'd unconsciously go, "Yeah, that sounds like an orchestra."
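To sketch that "four seconds to code" claim (the instrument names and MIDI-number ranges below are rough assumptions of mine, not real orchestration data): an instrument can double a part whenever the part fits inside its range.

```javascript
// Hypothetical sketch: auto-assign orchestral doublings by range.
// Ranges are approximate MIDI note numbers and are my own assumptions.
var instruments = [
  { name: "violin", low: 55, high: 103 },
  { name: "flute", low: 60, high: 96 },
  { name: "viola", low: 48, high: 91 },
  { name: "frenchHorn", low: 41, high: 77 },
  { name: "cello", low: 36, high: 76 }
];

// Which instruments could double a part spanning partLow..partHigh?
function doublings(partLow, partHigh) {
  return instruments
    .filter(function (inst) { return inst.low <= partLow && partHigh <= inst.high; })
    .map(function (inst) { return inst.name; });
}
```

Feed it a part from G4 to C6 and it suggests the violin/flute/viola family; the horn and cello drop out because the top notes exceed their (assumed) ranges.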

Same with more composition-related stuff. If I were to write an algorithm (using pseudo-code JavaScript here) to define a Law & Order procedural track:

var genre = "procedural";

if (genre === "procedural") { // a single `=` here would assign rather than compare
var tempo = 112; // ideally we could randomize this within a given range, but that'd make the code more dense, so I'm leaving that out
var key = "D minor"; // same deal here
var signature = "4/4"; // as a string, since the bare expression 4/4 would just evaluate to 1
var chordProgression = ["i", "bVI", "bVI/3"]; // this is just an example, obviously there'd be more potential chord progressions to cycle through
var harmonicRhythm = 2; // same deal here
}

Obviously you'd have to define that 112 is actually 112 bpm, that D minor contains these notes and not those, what the top 4 and the bottom 4 in 4/4 means, etc. And obviously this isn't complete--I'm just doing the easy lifting of defining the parameters, not the composition itself; you'd need to define sections of music, the order of the sections, what the instruments are, what patches they are, etc etc etc. And you'd of course have to tie this code to another application that can interpret it, can call the patches, can translate what you're writing as MIDI, stuff like that. 
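To make that "easy lifting" concrete, here's a toy continuation (entirely my own sketch, not any actual service's method): expanding a key and a Roman-numeral progression into MIDI note numbers, D minor triads only.

```javascript
// Hypothetical sketch: render a Roman-numeral progression in D minor
// as MIDI note numbers -- just enough to show the parameters really
// can be made concrete.
var D = 62; // MIDI note number for D4
var chords = {
  "i":   [D, D + 3, D + 7],  // D F A
  "bVI": [D - 4, D, D + 3]   // Bb D F, in close position
};

function renderProgression(progression) {
  return progression.map(function (symbol) { return chords[symbol]; });
}
```

`renderProgression(["i", "bVI", "i"])` gives three arrays of note numbers a sequencer layer could then voice, time, and send to patches.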

But that's easy stuff. Really easy. People were doing this thirty years ago--it just sounded like shit because producing music on a computer was in its infancy. The production level is still the bane of these services, but come on--you can easily analyze a mix with Ozone and, just like I did above, break down whatever processes during mixing resulted in a quote-unquote "mix that's good enough for the general audience" into a pretty small and easy algorithm. Because at the end of the day, anything in music can be quantified. You don't need to use an LA-2A--you just need your bass guitar to _sound_ like it's being run through one, which we can already do. Stuff like that.

There's a lot more to it than what I'm demonstrating here, but not _a whole lot_ more. Every single component of music can be broken down to its smallest part. We know this because we do this when we learn music. If you've ever taught music (break down concepts to their smallest parts), and you know how to code (translate those smallest concepts into something that a computer can interpret), then you could probably create one of these services yourself if you really, really wanted to.

Honestly, I actually hope this gains ground--and it will--and while the transition will be painful, on a _global, every-industry_ scale we'll be forced to reckon with the fact that most of the reasons to organize society under a capitalist form will be negated. I'm not preaching that this will automatically equal utopia, because you'd have to be pretty fucking stupid to look back at every other major junction in history and assume that everything will be peachy on the other end. There's a major chance that the trend could eventually result in extreme boot-on-face capitalism, extreme dystopia. Maybe it won't change much at all.

But regardless, mass automation _is_ inevitable, and we composers will have to deal with it at some point too--and probably much sooner than you think. I've actually heard about Amper's plans, and they're big and coming fast, and money is literally being poured into them by those who will benefit (i.e. companies who currently pay composers our "expensive rates"). And since there's a chance that (again, on an _affecting literally everyone on the planet_ scale) we humans actually _could_ figure out this whole "basically every job that people have in order to make money can be automated" thing, there's a _chance_ that, who knows, maybe we can leave the bullshit Law & Order procedural music that most of us could write in our sleep to the computers, and--assuming that we're talking about one of the better possible timelines here--could instead focus on the art, leaving the algorithms and the computers to disrupt to their heart's content. That's a very rosy "post-Work" future, and it's one of a number of possible timelines. And while I'm not super optimistic that we'll figure it out within my lifetime, I'm glad that this conversation is popping up pretty regularly now in pop culture. People are taking it seriously, and hopefully we can navigate it adequately. For now, I'd say I'm truly neutral. I could see automation going in a number of ways, and to be honest I expect we'll pass through all of them at some point, roughly worst to best with a lot of dips and valleys along the way. Broadly speaking, that's how humans have dealt with every other paradigm shift.


----------



## AlexRuger (Jul 6, 2017)

jononotbono said:


> I think it's pretty obvious that when people use emulations of musical instruments, no matter how advanced they are and how fantastic they have started to sound, they are nothing like the real deal. Why do people think the human being can just be emulated? It's ridiculous. Music isn't maths. Music is emotion and no computer or algorithm feels emotion. People always want something new and exciting and the only thing capable of that is human chaos (the human being). You can't program that.



No, music is formed of tiny component parts that, when put together, elicit emotion. The computer doesn't have to _understand_ emotion--what does that even mean, anyway? People write the code, and they understand what the emotional goal is, and therefore can define the parameters as they see fit to reach said emotional goal.

So, yes, you absolutely can program that. Programming a car to drive by itself is a metric _shit ton_ more complicated than programming a computer to write some music. Bring in machine learning--a technology being pioneered in part by, for example, automated cars that share their experiences and therefore learn from each other--and yeah, the algorithms will be writing some damn convincing music. It's really just a matter of time, because the concepts are sound and the technology is either here already or very, very close.

Edit: to be really clear, though, I'm not talking about writing music to picture. _That _is a complicated _decision-making _process, which obviously would take far more time to figure out. Not impossible by any means, but is much, much farther off than writing music that will later be _placed _to picture.
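To illustrate the point about people encoding the emotional goal (all the preset values below are my own arbitrary assumptions, purely for illustration): the machine never "feels" anything, it just looks up parameters a human associated with a mood.

```javascript
// Hypothetical sketch: a human-defined mapping from an emotional goal
// to musical parameters. The computer "reaches" the emotion without
// understanding it, because a person encoded the association.
var moodPresets = {
  tense:      { mode: "minor", tempo: 140, dynamics: "crescendo" },
  mournful:   { mode: "minor", tempo: 60,  dynamics: "soft" },
  triumphant: { mode: "major", tempo: 120, dynamics: "loud" }
};

function parametersFor(mood) {
  // Fall back to something dramatic if the mood isn't defined.
  return moodPresets[mood] || moodPresets.tense;
}
```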


----------



## Rohann (Jul 6, 2017)

Replicant said:


> No matter how you slice it, this leads to a communist dystopia for us.
> 
> "We'll just have a universal basic income!"
> 
> ...


Not to mention that pure automation, at least on an abstract level, makes everything _completely and utterly meaningless_. The fact that people still buy handmade furniture and steelwork made by blacksmiths (generally pretty stuff, rather than stuff no smith wants to make, like nails), as well as the fact that people still enjoy talking to one another in person (Skype and the telephone haven't replaced this), does give me hope. It's sort of like cheat codes in videogames: boring as hell after a few minutes.

Life is indeed made significant through work (and work is what makes civilization exist), but my confidence that the CEOs of big corporations will actually have the foresight to put the two together is low at best (more people need to listen to the likes of Simon Sinek).


----------



## jononotbono (Jul 6, 2017)

AlexRuger said:


> No, music is formed of tiny component parts that, when put together, elicit emotion. The computer doesn't have to _understand_ emotion--what does that even mean, anyway? People write the code, and they understand what the emotional goal is.
> 
> So, yes, you absolutely can program that. Programming a car to drive by itself is a metric _shit ton_ more complicated than programming a computer to write some music. Bring in machine learning--a technology being pioneered in part by, for example, automated cars that share their experiences and therefore learn from each other--and yeah, the algorithms will be writing some damn convincing music. It's really just a matter of time, because the concepts are sound and the technology is either here already or very, very close.



No, you cannot program the exact point of creativity. That is the Human Chaos I'm talking about, not someone programming stuff to do certain things. In hindsight anyone can recreate an emotion, a feeling, whatever has actually happened, but creating an algorithm to think like a human, with every possible thought and outcome that leads to THAT decision? No way. Nonsense. Music may be built from components such as patterns, rhythms, pitches, and textures, but it isn't written like that, and no clever algorithm is ever going to be able to emulate the human brain. Sorry, I don't agree with your response. The human brain is ridiculously complex; the psychology behind why people make every decision is way beyond someone, a human, coding set rules to try and cobble together emulated human emotions.


----------



## Replicant (Jul 6, 2017)

jononotbono said:


> I think it's pretty obvious that when people use emulations of musical instruments, no matter how advanced they are and how fantastic they have started to sound, they are nothing like the real deal. Why do people think the human being can just be emulated? It's ridiculous. Music isn't maths. Music is emotion and no computer or algorithm feels emotion. People always want something new and exciting and the only thing capable of that is human chaos (the human being). You can't program that.



Music is physics and physics is math.

"Emotion" is our subjective feelings about it — it's not a real, tangible and magical thing that is required to create what we call music.

I don't mean to sound crass or rude here, but a lot of these quasi-religious "music can't be emulated by machines because it's emotion" posts are simply coping with the unfortunate reality that the things we love and attach a higher meaning and purpose to can, in fact, be reduced to algorithms that mimic how we learned to make them in the first place.

It takes away the feeling that art is of some higher, spiritual, inexplicable level of consciousness.


----------



## Rohann (Jul 6, 2017)

AlexRuger said:


> No, music is formed of tiny component parts that, when put together, elicit emotion. The computer doesn't have to _understand_ emotion--what does that even mean, anyway? People write the code, and they understand what the emotional goal is, and therefore can define the parameters as they see fit to reach said emotional goal.


True, but _where_ the music is written from, emotionally, and _why_, is perilous to ignore. Music is greater than simply the sum of its parts -- why do people appreciate minor flaws and errors in human playing, or the human voice?



Replicant said:


> Music is physics and physics is math.
> 
> "Emotion" is our subjective feelings about it — it's not a real, tangible and magical thing that is required to create what we call music.


While this is partly true, the same argument could again be made, in such a reductionistic manner, about any art form.

And the same could be said of language. Any language is a mathematical system of sorts, but the breadth of nuances in how it is used, how meaning is interpreted, etc., is _ridiculously_ complex. I don't think AI will replace authors anytime soon, at least not good ones. And this isn't even factoring in the general distaste this sort of idea creates in the minds of the public.



> I don't mean to sound crass or rude here,
> 
> but a lot of these quasi-religious "Music can't be emulated by machines because it's emotion" type of posts are simply coping with the unfortunate reality that things we love and attach a higher meaning and purpose to, can in fact be reduced to algorithms that mimic how we learned to make it in the first place.
> 
> It takes away the feeling that art is of some higher, spiritual, inexplicable level of consciousness.


While I agree with a significant portion of what you say, the implication that there is no meaning and purpose in these things comes from the assumption of a purely mechanistic universe (which is to imply that meaning and purpose are nonexistent to begin with). The experience of art in this form isn't purely in the art itself, I'd argue; it's in the human connection made through effective art.

While I do think computers will be able to emulate "emotional" music at some point, the significance and meaning of music will be lost. To deny the significance of human connection and expression in music is insane.


----------



## AlexRuger (Jul 6, 2017)

Guys, you aren't seeing my point. Like I said, *we are not talking about inventing here. *That's not what these technologies do, so thinking that they'll replace the human creative process is beside the point. I'm absolutely with you that, at least for the foreseeable future, creativity can not be represented algorithmically. It's just too complex, chaotic, unpredictable. I know someone who is currently studying this at Columbia (specifically, quantifying improvisation), and while it helps that he's absolutely brilliant and I'm not, I've seen his work and it's just impossibly dense and complex. We're talking about billions of potential decisions a person could make at any given time, and all of those are informed by past decisions, brain chemistry, our current mood, on and on and on...

I'm talking about what these technologies are targeting: commercial music that exists within well-established idioms. And creating music that is _good enough_ for the people who are buying it, at a price far lower than any human composer could demand, is not only possible but, in the context of all of programming, stupid easy.

Edit: People in this thread are conflating creativity itself with music. Creativity cannot at this moment be quantified--but music absolutely can. These "AI composer" services aren't targeting creativity itself--that's so vague it's meaningless, anyway--but they are targeting the creation of music via the parts of it that _can_ be quantified. They start small with easy-to-break-down idioms, and will get more complex as 1) people use them and add to the pool of data from which machine learning can occur (why else do you think they'd allow you to tweak the music yourself--they want to learn from musicians!), and 2) machine learning itself becomes more sophisticated. It's exactly the same for every automation service, such as self-driving cars.


----------



## Rohann (Jul 6, 2017)

AlexRuger said:


> Guys, you aren't seeing my point. Like I said, *we are not talking about inventing here. *That's not what these technologies do, so thinking that they'll replace the human creative process is beside the point. I'm absolutely with you that, at least for the foreseeable future, creativity can not be represented algorithmically. It's just too complex, chaotic, unpredictable. I know someone who is currently studying this at Columbia (specifically, quantifying improvisation), and while it helps that he's absolutely brilliant and I'm not, I've seen his work and it's just impossibly dense and complex. We're talking about billions of potential decisions a person could make at any given time, and all of those are informed by past decisions, brain chemistry, our current mood, on and on and on...
> 
> I'm talking about what these technologies are targeting: commercial music that exists within well-established idioms. And creating music that is _good enough_ for the people who are buying it, at a price far lower than any human composer could demand, is not only possible but, in the context of all of programming, stupid easy.


Definitely agree with you here. I'm more worried about human laziness and dollar signs in the eyes of businesspeople in this sense. I have no doubt algorithms could eventually replicate the kind of radio nonsense or generic commercial music one hears at a given point in the day; it's already intentionally (tightly) systematic in many areas of the music industry. I read an interview with Devin Townsend where he talked about working with Nickelback's producers once, and how certain chord progressions and lyrical topics (as well as structure and instrumentation) were for specific seasons and release times in the year. He also talked about how much that disgusted him, and that he'd never let the garbage they wrote together see the light of day.


----------



## Nokatus (Jul 6, 2017)

Rohann said:


> While I agree with a significant portion of what you say, the implication that there is no meaning and purpose in these things comes from the assumption in a purely mechanistic universe (which is to imply that meaning and purpose are nonexistent to begin with). The experience of art in this form isn't purely in the art itself, I'd argue; it's in the human connection made through effective art.
> 
> While I do think computers will be able to emulate "emotional" music at some point, the significance and meaning of music will be lost. To deny the significance of human connection and expression in music is insane.



Are you implying meaning and purpose [in the sense you are using these concepts] are something existing in the world as independent entities, instead of constructs taking shape as the result of the way we humans perceive, think, feel, act, experience and so forth?


----------



## jononotbono (Jul 6, 2017)

AlexRuger said:


> I'm talking about what these technologies are targeting: commercial music that exists within well-established idioms. And creating music that is _good enough_ for the people who are buying it, at a price far lower than any human composer could demand, is not only possible but, in the context of all of programming, stupid easy.



Well, I thought this was common knowledge? haha Sorry Alex. Glad we're on the same sheet. We're well past this point; AI isn't even necessary. Just go and pick some music from a royalty-free library that you can license for £6, and nobody who listens to it, including the human who "wrote" it, is even aware of how terrible and generic it is. The bar is already as low as it can humanly possibly be, so on that note... and thinking about it, I hope AI gets good!


----------



## novaburst (Jul 6, 2017)

DSmolken said:


> I think a lot of people are hoping that when AI technology gets better and starts pushing humans



I know you did not mean the quote this way, but I like to think of it as pushing humans, not out of the music industry, but to be more creative with music. In that respect AI is great, so bring it on: let AI try to take over the music industry. All it will do is push for greater, better, and more creative composers.

We all get inspired to do better by one another; the benchmark is always rising. If AI sets a high benchmark, then we will rise above it... it's that simple.


----------



## AlexRuger (Jul 6, 2017)

jononotbono said:


> We're well past this point; AI isn't even necessary. Just go and pick some music from a royalty-free library that you can license for £6, and nobody who listens to it, including the human who "wrote" it, is even aware of how terrible and generic it is.


Not just royalty free libraries. These services will very easily replace highly professional (but creatively barren/boring/lame/etc) music too. In the case of a lot of that stuff, production level alone is what sets apart a track on Extreme Music from Audio Jungle, since in terms of composition they're so simple (not a bad thing!) and basically interchangeable.


----------



## Nokatus (Jul 6, 2017)

Fer said:


> The engineer has intelligently established a set of rules to assign the higher number to the most powerful move. But the software doesn't know what a number is, or that five is more than two, or that the queen is much more important than the pawn; it's the engineer who knows it. If he defines that 2 is more than 5, then the software will assume it without problems and will lose against Carlsen in an instant. So in the end there is only a dead machine working as expected.





Fer said:


> I can imagine software with defined instructions like: "if a sudden surprise is needed, go from the dominant chord to the bVI chord instead of to the tonic". I think that in the future this could perhaps work for making correct/generic music, which could eventually replace some composers' jobs, but not for making something totally beyond the defined rules of the software.



We are already way past this sort of rule-based, developer-defined approach. One of the biggest challenges in AI, currently, is actually understanding (after the fact) why an AI took a particular action. The actions aren't preconfigured anymore, in the fashion you describe. Today's paradigms are different from the rigid, engineer-established, do-this-don't-do-that algorithms of the past.

This is a nice read on that subject, despite the rather dramatic headline: https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/

Also, chess and the traditional developer-driven weighting rules you are referring to are quite old beans. Instead, the fact that a deep learning system was able to beat a Go master, that's quite something actually: https://www.wired.com/2016/01/in-a-...gles-ai-beats-a-top-player-at-the-game-of-go/
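To make the contrast concrete, here's a toy sketch (entirely mine, and vastly simpler than any real learning system): instead of an engineer hard-coding "queen = 9, pawn = 1", a single weight is learned from example data via gradient descent, so the behavior comes from the data rather than from a written rule.

```javascript
// Hypothetical sketch: learn one weight w so that w * input fits the
// examples, via stochastic gradient descent on squared error.
// No rule is ever written down; w emerges from the data.
function learnWeight(examples, steps, rate) {
  var w = 0;
  for (var s = 0; s < steps; s++) {
    examples.forEach(function (ex) {
      var err = w * ex.input - ex.target;
      w -= rate * err * ex.input; // nudge w toward the data
    });
  }
  return w;
}
```

Given examples where the target is always twice the input, the weight converges toward 2 without anyone telling it so; explaining *why* a network with millions of such weights acts as it does is the interpretability problem the article above describes.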


----------



## jononotbono (Jul 6, 2017)

AlexRuger said:


> Not just royalty free libraries. These services will very easily replace highly professional (but creatively barren/boring/lame/etc) music too. In the case of a lot of that stuff, production level alone is what sets apart a track on Extreme Music from Audio Jungle, since in terms of composition they're so simple (not a bad thing!) and basically interchangeable.



Which brings us finally to my opinion of the truth. Anybody who pushes themselves, is unique, has individuality, and is more than just a Xerox machine churning out soulless track after soulless track will usually have longevity. I say usually because even the best churn stuff out; bills have to be paid, and love certainly doesn't pay them. There is nothing wrong with this. At the end of the day, human beings want new and exciting things, so whilst it may take time for the masses to get sick of a trend or paradigm, people sure do get sick of being force-fed the same old stuff time after time. This stuff only has a short shelf life before a new trend has to mix the diet up, and this is usually set by the trailblazers. I'm not gonna start worrying about AI. Ever. It sounds shit.


----------



## Replicant (Jul 6, 2017)

jononotbono said:


> At the end of the day, human beings want new and exciting things so whilst it may take time for the masses to get sick of a trend or paradigm, people sure do get sick of being force fed the same old stuff time after time. This stuff only has a short shelf life before a new trend has to mix the diet up and this is usually set by the trailblazers. I'm not gonna start worrying about AI. Ever. It sounds shit.



Which is what everyone whose job was replaced by a robot said at one time.

I have no idea why you think that if you're capable of coming up with something "new" (which doesn't really exist in music anyway), an AI would _not_ be capable of doing the same.

All that needs to be done is for the AI to throw in some subversion of expectations or cross-reference millions of existing works and create one that has something obviously different about it or contains uncommon/unusual traits.
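As a toy sketch of that cross-referencing idea (entirely hypothetical, nowhere near a production system): count chord transitions across existing progressions, then sometimes pick the *least* expected continuation instead of the most common one.

```javascript
// Hypothetical sketch: learn chord transitions from existing works,
// then deliberately subvert the most expected continuation some of the time.
function countTransitions(progressions) {
  var counts = {};
  progressions.forEach(function (prog) {
    for (var i = 0; i + 1 < prog.length; i++) {
      var from = prog[i], to = prog[i + 1];
      counts[from] = counts[from] || {};
      counts[from][to] = (counts[from][to] || 0) + 1;
    }
  });
  return counts;
}

// surpriseRate = 0 always picks the most common continuation;
// higher values sometimes pick the rarest one instead.
function nextChord(counts, current, surpriseRate) {
  var options = Object.keys(counts[current] || {});
  if (options.length === 0) return null;
  options.sort(function (a, b) { return counts[current][b] - counts[current][a]; });
  var subvert = Math.random() < surpriseRate && options.length > 1;
  return subvert ? options[options.length - 1] : options[0];
}
```

Train it on a pile of I-IV-V progressions and, with a nonzero surprise rate, it will occasionally hand you the deceptive resolution instead of the cadence everyone expects.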


----------



## jononotbono (Jul 6, 2017)

Replicant said:


> All that needs to be done is for the AI to throw in some subversion of expectations or cross-reference millions of existing works and create one that has something obviously different about it or contains uncommon/unusual traits.



And how is this algorithm supposed to tell what people want? It can't read minds and figure out what to churn out that will be the next big thing. It will always be behind the curve, writing done-to-death or irrelevant music that the masses don't engage with, which is what @AlexRuger and I have just been talking about. AI will never know what people want, and therefore no matter how many components from pre-existing works the algorithm references in order to come up with something unique, it won't mean anything. Plenty of people try to be different, and that's their very problem. They try too hard.


----------



## Replicant (Jul 6, 2017)

jononotbono said:


> And how is this algorithm supposed to tell what people want? It can't read minds and figure out what to churn out that will be the next big thing. It will always be behind the curve, writing done-to-death or irrelevant music that the masses don't engage with, which is what @AlexRuger and myself have just been talking about. AI will never know what people want, so no matter how many components from pre-existing works the algorithm references in order to come up with something unique, it won't mean anything. Plenty of people try to be different and that's their very problem. They try too hard.



Like I was saying, you're speaking in this absolute of

"AI will never..."

Yeah, well AI wasn't supposed to be a champion Go player for another 20 years, either.

An AI could know what people want in the same way you know what people want — by evaluating their reactions, being told, or simply asking.


----------



## bigcat1969 (Jul 6, 2017)

EDM isn't written by AI?


----------



## ChristianM (Jul 6, 2017)

Replicant said:


> An AI could know what people want in the same way you know what people want — by evaluating their reactions, being told, or simply asking.



Not "know," because an AI doesn't have consciousness, but yes, with big data you can build predictions.

With big data, the machine can make predictions, but in no way will the machine "understand" what it predicted; hence its difficulty projecting itself into totally new situations.
An "AI" is in itself just *conditioning*, independent of any spirit...


----------



## jononotbono (Jul 6, 2017)

bigcat1969 said:


> EDM isn't written by AI?



Tron actually.


----------



## Lotias (Jul 6, 2017)

I think AI as assistant tools would be very interesting. People in this thread talk a lot about the AI learning from musicians, but I think there's also a possibility that musicians could learn from AI. In programming it _is_ possible to add an element of randomization, and even further from that, it _is_ possible for the AI to make a sort of 'educated gamble'. And far enough down the road, it might end up being something you never considered but might just work.

A second thought I had is that beginning composers could maybe learn a bit from composing AI, as well.


----------



## jononotbono (Jul 6, 2017)

Lotias said:


> I think AI as assistant tools would be very interesting. People in this thread talk a lot about the AI learning from musicians, but I think there's also a possibility that musicians could learn from AI. In programming it _is_ possible to add an element of randomization, and even further from that, it _is_ possible for the AI to make a sort of 'educated gamble'. And far enough down the road, it might end up being something you never considered but might just work.
> 
> A second thought I had is that beginning composers could maybe learn a bit from composing AI, as well.



Yeah for sure. I mean, great musicians learn from everything (not limited to anything musical) so why not!


----------



## Rohann (Jul 6, 2017)

Replicant said:


> Like I was saying, you're speaking in this absolute of
> 
> "AI will never..."
> 
> ...


Your language implies a belief in the possibility of hard AI, unless you're speaking metaphorically.
I strongly believe "soft" AI is in the future (and obviously the present), but the plethora of problems (logical and otherwise) with a mechanistic mind (and hard AI) is revealing (e.g. the hard problem of consciousness).



Nokatus said:


> Are you implying meaning and purpose [in the sense you are using these concepts] are something existing in the world as independent entities, instead of constructs taking shape as the result of the way we humans perceive, think, feel, act, experience and so forth?


Good question. No...and yes?
Meaning is a product of thought content and intentionality (an act of projecting), which is a mental phenomenon (e.g. language: a code set of abstract shapes and sounds used in a particular manner enables one to effectively communicate incredibly dense and complex ideas, but the meaning isn't inherent in the shapes themselves). This in no way implies it is less important or significant.
Purpose, on the other hand...this is a topic for a much larger discussion than is appropriate for this forum, but I think purpose is both...sort of. This concept makes more sense given theistic and teleological arguments, but even in those cases purpose is dependent on a mind (or "Mind").

Relevant to this conversation, I don't think hard AI is possible because I don't think mind is purely physical/biological. I think a good illustration of the problems with hard AI comes from Searle's "Chinese Room" argument, as well as some of Nagel's arguments. Even if one does think the human mind is purely biological, the problems with a computational approach to the mind are extremely difficult to resolve.


----------



## Eric G (Jul 6, 2017)

Lotias said:


> I think AI as assistant tools would be very interesting. People in this thread talk a lot about the AI learning from musicians, but I think there's also a possibility that musicians could learn from AI. In programming it _is_ possible to add an element of randomization, and even further from that, it _is_ possible for the AI to make a sort of 'educated gamble'. And far enough down the road, it might end up being something you never considered but might just work.
> 
> A second thought I had is that beginning composers could maybe learn a bit from composing AI, as well.



I am right with you on this. Hopefully these AI-powered programs allow experimentation (Orb seems to be like this). I hope to learn about chord progressions, orchestration combinations and other musical elements I have never tried before. With AI assisting me I hope to grow as a composer and stay one step ahead of the competition, including a standalone AI.


----------



## AlexRuger (Jul 6, 2017)

Rohann said:


> Even if one does think the human mind is purely biological...



What would lead you to believe that it's anything but?


----------



## Replicant (Jul 6, 2017)

Rohann said:


> Your language implies a belief in the possibility of hard AI, unless you're speaking metaphorically.
> I strongly believe "soft" AI is in the future (and obviously the present), but the plethora of problems (logical and otherwise) with mechanistic mind (and hard AI) is revealing (i.e. the hard problem of consciousness, etc).
> 
> 
> ...



My entire point is best summed up in saying that we don't _need_ hard AI for an AI to be able to create music that humans can "emotionally connect" with, if you will, in the same way that we do with music created by humans.

I also disagree that "hard AI" isn't possible, since that conclusion rests on your suggestion of a "soul," which I don't personally believe exists; I think the mind is rather a consequence of complex biological processes within the brain which we don't understand, and perhaps don't _need_ to understand in order to set the wheels of its evolution in code in motion. I think this is already demonstrated on a small scale by that article someone linked about Nvidia's driverless car — but that's a completely different conversation that will go nowhere, anyway.


----------



## Nokatus (Jul 6, 2017)

Rohann said:


> Relevant to this conversation, I don't think hard AI is possible because I don't think mind is purely physical/biological. I think a good illustration of the problems with hard AI come from Searle's "Chinese Room" argument, as well as some of Nagel's arguments. Even if one does think the human mind is purely biological, the problems with a computational approach to the mind are extremely difficult to resolve.



Ah, that goes into the territory of religion pretty much, then. There's nothing tangible to support something "magical" (not in a disparaging sense, but quite literally, something so great and of "spiritual origin" that one could behold it as a miraculous thing) at work in the human mind; instead, there's just a belief, the one you are currently expressing, that qualia of different experience/emotion/etc. states are something so intrinsically personal and human-like that an artificial consciousness couldn't experience similar things. So, in the end, it's a topic more suitable for personal beliefs and religion and such. Meanwhile, consciousness and AI research will carry on regardless of beliefs or opinions one way or the other.

The thing I'm personally most pessimistic about is, the time might come surprisingly soon that there are artificial systems capable of having negative experiences -- say, on the level of animals. Attitudes like the one you're demonstrating will have an effect on our ethics toward systems like that. What rights will they have compared to biological entities?

The Chinese Room argument is a deeply flawed one in critiquing the AI concepts of today. It was constructed in an era when the approach to AI was the descriptive, rigid rule based algorithm kind, and it points out shortcomings in that line of thinking. However, what the future of AI looks like today is about artificial learning systems that shape and adapt their own functions and reactions through a dynamical, empirical learning model.

It's about biology inspired systems and providing them with the means to grow and adapt through experience based interaction and raw input, instead of trying to devise specific models of processing high level abstract information as-is, differing from case to case (the rule based "do this, don't do that, this way you'll manage doing this specific task and sort of appear intelligent" approach).


----------



## Eric G (Jul 6, 2017)

If someone starts talking about the Singularity in this thread, then we have gone too far off topic.


----------



## dpasdernick (Jul 6, 2017)

I wonder... When Skynet becomes self-aware will it write Muskrat Love? Because that would be da shit...

https://m.youtube.com/#/watch?v=xBYV_7a0FQs


----------



## Kyle Preston (Jul 6, 2017)

AlexRuger said:


> Guys, you aren't seeing my point. Like I said, *we are not talking about inventing here. *That's not what these technologies do, so thinking that they'll replace the human creative process is beside the point. I'm absolutely with you that, at least for the foreseeable future, creativity can not be represented algorithmically. It's just too complex, chaotic, unpredictable. I know someone who is currently studying this at Columbia (specifically, quantifying improvisation), and while it helps that he's absolutely brilliant and I'm not, I've seen his work and it's just impossibly dense and complex. We're talking about billions of potential decisions a person could make at any given time, and all of those are informed by past decisions, brain chemistry, our current mood, on and on and on...
> 
> I'm talking about what these technologies are targeting: commercial music that exists within well-established idioms. And creating music that is _good enough_ for the people who are buying it, at a price far lower than any human composer could demand, is not only possible, but in the context of all of programming, stupid easy.
> 
> Edit: People in this thread are conflating creativity itself with music. Creativity can not at this moment be quantified--but music absolutely can. These "AI composer" services aren't targeting creativity itself--that's so vague it's meaningless, anyways--but they are targeting the creation of music via the parts of it that _can _be quantified. They start small with easy-to-break-down idioms, and will get more complex as 1) people use them and add to their pool of data from which machine learning can occur (why else do you think they'd allow you to tweak the music yourself--they want to learn from musicians!), and 2) machine learning itself becomes more sophisticated. It's exactly the same for every automation service, such as self-driving cars.



I started a thread on this topic a month ago and came to the same conclusion. AI doesn't have to get better at *inventing* than composers – it just has to fool most listeners. If/when it does that, it *will be* a competitor for work (at least at companies that can't tell the difference and want to save money).

Also relevant, an awesome NPR broadcast on the topic. If you have 8min, it's worth a listen. I love that there's a group of characters that open up music boxes (designed and built by machines) with the intention of adding flaws and wrong notes – just to fuck with people.


----------



## Replicant (Jul 6, 2017)

Simply put:

We're fucked.


----------



## novaburst (Jul 6, 2017)

@Replicant this may be the case if you only do compositions for money.

But if you love making music you will keep on doing it whether AI is taking over or not.

Remember we have millions of composers writing and creating music. Does that stop you, or are you screwed because we have so many human composers? The answer is no.

So why are you screwed if AI joins the game?

At the end of the day, listeners will choose to purchase what they want.

If the music humans create is not cutting it, why not go for AI? But just like any preference in the human world, we choose and have a feel for certain compositions. If the composition we choose happens to be created by a human, that's cool; if it happens to be AI, that's cool too.

Whatever is, is what is... the customer will have the last say.

Do people go for music because it's cheap, or because it touched them? AI may be a very cheap version of music, but if your music touched someone's heart they will pay good money for your piece.


----------



## NoamL (Jul 6, 2017)

Interesting discussion. I can speak to it with *a little* bit of experience because I developed my own music AI, called Hyperion. It ended up not being as useful as I wanted. I forget whether I've posted about it here before. (More about Hyperion in a bit, and I will share some of its "music" output.)

These tech startups are going down exactly the wrong path, of course. I do see their logic: _"Hey, there is already a computer-parsable language that encodes musical meaning: MIDI! So we can start by teaching our AI how to read MIDI! And then, there are hundreds of thousands of real musical pieces that have been encoded into MIDI! So we just chuck all of Schubert down the hatch and let the AI do its machine-learning magic!"_ And you end up with an AI that spits out garbage: aimless piano-roll music or generic-as-hell 4-bar progressions. But it's enough to make the venture capitalists imagine what's theoretically possible and throw $5 million at the startup.

I don't question the premise that music can be encoded as information, but the important information in music obviously doesn't exist at the MIDI note level or even the program level, it's structural. So, you can't get around the hard part. If you want to teach a computer how to compose, you have to teach it composition. If you ONLY want the AI to be a "composer's assistant" then some music knowledge can be abstracted away.

So the AI that I created, called Hyperion, was aiming to be that second sort of AI. I fed it a ton of short themes from Remote Control Productions composers & superhero films - things like Silvestri's Avengers, Jackman's X-Men First Class, Djawadi's Pacific Rim theme, and so on.

The only goal I was trying to achieve with the AI was to get it to write short catchy themes, motifs really.

I need to take a quick aside to explain the musical language I used with Hyperion so you can understand it.

The first piece Hyperion learned was Jackman's theme for X-Men First Class:



Which I taught to Hyperion like this:

*C_START(C,G,C) Eb_Maj(Bb) F_Maj(A,G,F) Ab_Maj(Eb) Bb_Maj(D) C_STOP*

Hyperion knows music as "strings" of code; each piece is a string.

All strings are transposed to the key of C, and they all start & end with C.

The "words" of the language are chords. Eb_Maj is Eb Major.

Any chord can have melody notes attached, displayed in parentheses. Passing or weak notes have a minus sign "-" in front of them, but other than that, the language doesn't know or care about any rhythmic aspects of music.
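As an aside, the format is simple enough that you could parse it yourself in a few lines. Here is a minimal Python sketch of a reader for these strings (my own illustration of the format described above, not Hyperion's actual code):

```python
import re

def parse_theme(theme):
    """Split a Hyperion-style string into (chord, melody_notes) pairs.

    Melody notes keep their leading '-' so passing/weak notes
    stay distinguishable, per the rules above.
    """
    events = []
    for token in theme.split():
        # A token is a chord name, optionally followed by melody notes in parens.
        m = re.fullmatch(r"([A-G][b#]?_\w+)(?:\(([^)]*)\))?", token)
        chord, notes = m.group(1), m.group(2)
        events.append((chord, notes.split(",") if notes else []))
    return events

# The X-Men: First Class string from above:
for chord, melody in parse_theme(
        "C_START(C,G,C) Eb_Maj(Bb) F_Maj(A,G,F) Ab_Maj(Eb) Bb_Maj(D) C_STOP"):
    print(chord, melody)
```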

So here's another theme:

*C_START(C) Ab_Maj(C,-G) F_Maj(F) Ab_Maj(Eb) Bb_Maj(D) C_STOP*

See if you can work out at the piano which movie this was!

I fed Hyperion about 40-50 different strings like this, and the AI developed a "mental map" of connections. It looked a little like this:







Then I let Hyperion randomly generate music. I used something called "Markov chains" - this just means the computer plays a slightly random game of hopscotch, where it's standing on a chord, and it thinks about all the chords it could jump to, and it assigns them probabilities based on the music examples it knows. Then, every time the AI jumps to a new chord, it "populates" that chord with some melody notes, again by consulting a probability table generated by combining all of the music it knows. So the AI actually IS NOT writing a melody horizontally. But the result still sounds okay.
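That game of hopscotch is only a few lines of code. Here's a toy sketch of the chord-level Markov walk (my own illustration, not Hyperion's actual code; it leaves out the melody-population step and uses melody-stripped strings):

```python
import random
from collections import defaultdict

def train(progressions):
    """Count chord-to-chord transitions across the training strings."""
    counts = defaultdict(lambda: defaultdict(int))
    for prog in progressions:
        chords = prog.split()
        for a, b in zip(chords, chords[1:]):
            counts[a][b] += 1
    return counts

def generate(counts, max_len=12):
    """Weighted random walk from C_START until C_STOP (or max_len chords)."""
    chord, out = "C_START", ["C_START"]
    while chord != "C_STOP" and len(out) < max_len:
        options = list(counts[chord].keys())
        weights = list(counts[chord].values())
        chord = random.choices(options, weights=weights)[0]
        out.append(chord)
    return " ".join(out)

# Two melody-stripped strings in the post's notation:
model = train([
    "C_START Eb_Maj F_Maj Ab_Maj Bb_Maj C_STOP",
    "C_START Ab_Maj F_Maj Ab_Maj Bb_Maj C_STOP",
])
print(generate(model))  # a random walk through the learned transitions
```

Even this toy makes the limitation visible: the walk can never use a transition it hasn't seen in the training data.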

Since this is a true "composer assistant" software, it's up to the composer to decide if Hyperion's output is any good, or if it's inspiring enough to add rhythm, orchestration, development, etc.

If you want to play around with it, I just loaded Hyperion up again this evening and asked it for 100 musical themes. Two seconds later I got this list. Whip out your piano keyboard and see if any of these ideas could be the seed for your next composition!

(The output list is pretty long so I'll put it in the next post)

Hyperion was an interesting experiment but I thought that, in the end, *the outputs sound too much like the inputs.* Hyperion simply doesn't know how to write things it hasn't heard. Hyperion would never use a B major chord if I hadn't given it "Dream Is Collapsing" (and a lot of trailer tracks that, incidentally, _also_ were written after 2010... so _they_ were copying HZ too). And when I experimented with adding random fuzz to Hyperion's probability tables, the points in the music where Hyperion stepped away from the established probability table INSTANTLY poked out: they sounded like wrong notes. This, to me, is reassuring. It shows why composers aren't going away. Because on a music theory level, the 4-chord progression that HZ wrote for Inception is deeply weird and novel. But it's also accessible and feels logical, inevitable even. Achieving both of those things at the same time is something that AIs are probably going to be bad at for a very long time. It's like asking an AI to invent a flavor that nobody's ever tasted, yet also have high confidence that people will like it.


----------



## NoamL (Jul 6, 2017)

*HYPERION OUTPUTS JUL 6 2017*

C_START(C,-G) Eb_Maj(F,-C) F_Maj(G,A) C_STOP

C_START(C,C) Eb_Maj(Bb,Eb) Ab_Maj(Eb) Bb_Maj(Bb,-Bb) G_Maj(G,B) C_STOP

C_START(G,C) B_Maj Ab_Maj(Ab,C) Bb_Maj(F) G_Maj(D) C_STOP

C_START(C) D_Maj(A) Ab_Maj(Eb) Bb_Maj(D) G_Maj(G,G) C_STOP

C_START(G,-D) D_Maj(D) Ab_Maj(C) Eb_Maj(G) G_Maj(D,B) C_STOP

C_START(C,-G) D_Maj(D) Ab_Maj(Eb,-D) Bb_Maj(Bb,-Bb) G_Maj(D,D) C_STOP

C_START(C,-Bb) F_Maj(F,F) Ab_Maj(Eb) C_STOP

C_START(G) D_Maj(D,D) Ab_Maj(C,-G) C_STOP

C_START(C) B_Maj(B) Ab_Maj(C) Bb_Maj(D,D) G_Maj(G,G) C_STOP

C_START(G) F_Maj(A,-Eb) Ab_Maj(Eb) F_Maj(C) G_Maj(D,B) C_STOP

C_START(C,G) B_Maj(B) Ab_Maj(C) Eb_Maj(Eb,-Bb) G_Maj(G) C_STOP

C_START(G) F_Maj(G,-F) Ab_Maj(C,-F) Eb_Maj(Bb,-Bb) G_Maj(G,B) C_STOP

C_START(E,-G) F_Maj(A,-Eb) G_Maj(D,B) C_STOP

C_START(C,-C) Eb_Maj(Bb,G) G_Maj(G) C_STOP

C_START(G,Eb) D_Maj(D,D) Ab_Maj(Eb) Bb_Maj(Bb) G_Maj(D) C_STOP

C_START(C,G) D_Maj(A) Ab_Maj(G,C) C_STOP

C_START(C) Ab_Maj(Ab,C) Bb_Maj(D,-Bb) C_STOP

C_START(C) F_Maj(G,F) G_Maj(G) C_STOP

C_START(C) F_Maj(C,A) Ab_Maj(C,-G) C_STOP

C_START(C) Ab_Maj(Eb,-D) Bb_Maj(D) C_STOP

C_START(C,-D) Eb_Maj(G,F) Bb_Maj(F,-Bb) C_STOP

C_START(C) Eb_Maj(Bb) Bb_Maj(D,-Bb) C_STOP

C_START(C) Eb_Maj(G) Ab_Maj(Ab,Eb) F_Maj(A,-C) G_Maj(G,B) C_STOP

C_START(C,G) B_Maj Ab_Maj(G) C_STOP

C_START(C) F_Maj(F,F) G_Maj(G,B) C_STOP

C_START(C,C) F_Maj(A,-Eb) G_Maj(D) C_STOP

C_START(C,-D) D_Maj(D) Ab_Maj(C,-G) Eb_Maj(C,Eb) G_Maj(G,B) C_STOP

C_START(G,-C) Eb_Maj(Eb,-Bb) Ab_Maj(C,Eb) F_Maj(A) G_Maj(G,D) C_STOP

C_START(G) Eb_Maj(G) G_Maj(G,D) C_STOP

C_START(C,-C) D_Maj(A) Ab_Maj(G,Eb) Eb_Maj(G,-A) Bb_Maj(D,-Bb) C_STOP

C_START(C,G) B_Maj Ab_Maj(C) C_STOP

C_START(C,G) Eb_Maj(Bb) G_Maj(D,D) C_STOP

C_START(Eb) D_Maj(D,A) Ab_Maj(Ab,-Eb) C_STOP

C_START(C,C) Eb_Maj(Bb,G) Ab_Maj(C) C_STOP

C_START(G,-G) D_Maj(A,A) Ab_Maj(Eb) C_STOP

C_START(C,-Eb) D_Maj(D) Ab_Maj(Ab,Eb) C_STOP

C_START(C,G) Eb_Maj(C,-G) G_Maj(D) C_STOP

C_START(C,-C) D_Maj(F#) Ab_Maj(Eb,-Eb) C_STOP

C_START(G) F_Maj(G,F) G_Maj(D) C_STOP

C_START(C,C) F_Maj(A) G_Maj(G) C_STOP

C_START(G) D_Maj(D) Ab_Maj(C) F_Maj(A,F) G_Maj(G) C_STOP

C_START(C,-G) Eb_Maj(Bb,-G) F_Maj(G) C_STOP

C_START(G) B_Maj(B) Ab_Maj(C,-G) Bb_Maj(F,F) G_Maj(D) C_STOP

C_START(C,-C) Eb_Maj(F,Eb) G_Maj(D) C_STOP

C_START(G,-D) F_Maj(F,-F) G_Maj(G) C_STOP

C_START(G) D_Maj(D) Ab_Maj(Eb) Eb_Maj(Eb) Ab_Maj(F,C) C_STOP

C_START(C,-C) B_Maj Ab_Maj(Eb,Eb) C_STOP

C_START(C,-C) F_Maj(F) G_Maj(D) C_STOP

C_START(C,-Bb) D_Maj(D) Ab_Maj(C,-D) C_STOP

C_START(C) D_Maj(D) Ab_Maj(Eb,Eb) Bb_Maj(F) G_Maj(D,D) C_STOP

C_START(G) Eb_Maj(G,-G) F_Maj(G,A) C_STOP

C_START(G) B_Maj(B) Ab_Maj(C,-G) C_STOP

C_START(G) B_Maj Ab_Maj(Ab,-C) Bb_Maj(D,F) G_Maj(G,G) C_STOP

C_START(Eb) F_Maj(C,F) G_Maj(D,B) C_STOP

C_START(G,C) B_Maj Ab_Maj(G,C) C_STOP

C_START(G) D_Maj(D) Ab_Maj(Eb) Eb_Maj(Bb) G_Maj(D,B) C_STOP

C_START(Eb,C) Eb_Maj(C) G_Maj(D) C_STOP

C_START(G,C) B_Maj Ab_Maj(Eb,-Eb) Eb_Maj(Bb,Bb) Ab_Maj(C,-D) Eb_Maj(Eb,F) Ab_Maj(Eb) C_STOP

C_START(G,-G) B_Maj Ab_Maj(G,-G) Bb_Maj(F,C) G_Maj(G,G) C_STOP

C_START(G,-C) Eb_Maj(G) G_Maj(G,D) C_STOP

C_START(Eb,C) Eb_Maj(F,Bb) G_Maj(G,G) C_STOP

C_START(G,G) Eb_Maj(G) G_Maj(G,G) C_STOP

C_START(G,C) D_Maj(D) Ab_Maj(Ab) C_STOP

C_START(C) B_Maj(B) Ab_Maj(C) Eb_Maj(Eb,Bb) Bb_Maj(C,-Bb) C_STOP

C_START(C,-D) B_Maj Ab_Maj(C,-F) C_STOP

C_START(G) B_Maj Ab_Maj(Eb,G) F_Maj(G,A) Ab_Maj(C,-C) Eb_Maj(Eb) Ab_Maj(Eb,-G) C_STOP

C_START(G,G) D_Maj(F#,F#) Ab_Maj(Ab,C) Eb_Maj(G) G_Maj(D) C_STOP

C_START(Eb) B_Maj Ab_Maj(Eb,Eb) C_STOP

C_START(C,-G) B_Maj(B) Ab_Maj(Eb) Bb_Maj(F) G_Maj(D) C_STOP

C_START(C,-D) F_Maj(C) Ab_Maj(Eb,-Eb) Eb_Maj(Bb,-G) G_Maj(D,B) C_STOP

C_START(C,-G) Ab_Maj(F) Bb_Maj(D,-Bb) C_STOP

C_START(G,C) B_Maj Ab_Maj(Ab,-C) C_STOP

C_START(G,-D) F_Maj(A,-Eb) G_Maj(D,B) C_STOP

C_START(G) D_Maj(D,D) Ab_Maj(Eb,-Eb) C_STOP

C_START(G,C) B_Maj Ab_Maj(Eb) F_Maj(F) Ab_Maj(C,Ab) F_Maj(F) G_Maj(D) C_STOP

C_START(C,G) B_Maj Ab_Maj(Ab,Ab) C_STOP

C_START(C,G) D_Maj(D) Ab_Maj(F,-G) C_STOP

C_START(C,-G) Eb_Maj(Bb,-C) Ab_Maj(C) Bb_Maj(D) G_Maj(D,D) C_STOP

C_START(C,C) D_Maj(F#) Ab_Maj(Ab,Eb) C_STOP

C_START(C,-Bb) B_Maj(B) Ab_Maj(Eb,-Eb) C_STOP

C_START(G,C) F_Maj(F,F) Ab_Maj(Eb,-F) C_STOP

C_START(C,-D) Eb_Maj(G,G) F_Maj(F,-D) C_STOP

C_START(C) F_Maj(F,A) Ab_Maj(C,-Eb) Eb_Maj(Eb) Ab_Maj(Eb,C) C_STOP

C_START(E,C) B_Maj Ab_Maj(C,C) C_STOP

C_START(C) Ab_Maj(G) F_Maj(F,-Eb) Ab_Maj(C,Eb) F_Maj(F,-C) C_STOP

C_START(C,G) F_Maj(F,A) G_Maj(D) C_STOP

C_START(C) B_Maj(B) Ab_Maj(C) Bb_Maj(Bb,-Bb) G_Maj(D) C_STOP

C_START(C,G) D_Maj(A,D) Ab_Maj(Eb) Bb_Maj(D,-Bb) G_Maj(G) C_STOP

C_START(C,-G) D_Maj(F#) Ab_Maj(Ab,-C) F_Maj(A,-D) Ab_Maj(C) Bb_Maj(D) G_Maj(D,B) C_STOP

C_START(C,C) D_Maj(A) Ab_Maj(Eb,-D) C_STOP

C_START(G,G) B_Maj Ab_Maj(Eb,Eb) Eb_Maj(G) Ab_Maj(Ab,Ab) Bb_Maj(D) G_Maj(G,B) C_STOP

C_START(C,G) Ab_Maj(C) Eb_Maj(Bb) F_Maj(A,-Eb) G_Maj(G,D) C_STOP

C_START(C,-G) F_Maj(G,-Eb) Ab_Maj(Eb) Bb_Maj(D) G_Maj(G) C_STOP

C_START(C,C) Eb_Maj(G) G_Maj(G) C_STOP

C_START(G,-D) F_Maj(G,A) G_Maj(D,B) C_STOP

C_START(Eb) F_Maj(F,G) G_Maj(G) C_STOP

C_START(G) B_Maj(B) Ab_Maj(C) C_STOP

C_START(G) B_Maj Ab_Maj(C,C) C_STOP

C_START(E,-D) B_Maj(B) Ab_Maj(C) C_STOP

C_START(Eb,C) Eb_Maj(Bb) Ab_Maj(Eb,Eb) C_STOP


----------



## NoamL (Jul 6, 2017)

Also, one thing nobody has pointed out yet is that there's an analogous set of products "threatening" the film-editing world. But still nobody edits their movie in iMovie, right? Not even professional YouTubers, exactly the sort of people you'd imagine Apple is aiming for (need lots of videos edited fast, low expertise). They all still edit in Final Cut or Premiere. So... why not iMovie? Because it's limited to the generic, and the generic quickly becomes an EYESORE. The only iMovie video you were ever impressed by is the first one you ever saw; after that you just said "Ah, I recognize this text zoom effect - you made this in iMovie."


----------



## Replicant (Jul 6, 2017)

novaburst said:


> @Replicant this may be the case if you only do compositions for money.
> 
> But if you love making music you will keep on doing it whether AI is taking over or not.
> 
> ...



When I say "we're fucked", I mean:

I will keep making music regardless; that is, assuming I can even afford to eat when automation inevitably brings about the dystopia in coming decades. I am convinced it is going to happen because full automation and AI are inevitable for like...90% of all industries and it either leads to a golden age or dystopia where the middle class is but a distant memory.

However, dystopia is the correct answer because any of the "optimistic" predictions have already been, time and again, proven to be infeasible; thus leaving only one option. A UBI is not sustainable, discourages entrepreneurship, and renders the population a slave to that income. Human civilization is built on work, and machines aren't being designed to augment it anymore; they're being designed to replace it. Even if a universal income was sustainable, it will render us all vegetables. The majority of displaced workers won't take up the clarinet or a paint brush; they'll take up arms. Crime will rise, anti-tech platforms will become a common theme of right-wing politicians, and so on, and the only people who could change it (CEOs and businesses) can't be trusted to take the necessary action. Don't even get me started on the sci-fi BS of "the singularity": no one wants to be hooked up to a machine, especially in an age when everything down to pacemakers can be hacked and there's no privacy as it is.

In 10-20 years, I doubt people will have money to buy my music at all — they'll be too occupied trying to prevent their families from starving and getting robbed.

I used to say people that posted this kinda stuff were just nutjob survivalists, but with each passing year, I'm convinced they might be right.


----------



## DSmolken (Jul 6, 2017)

jononotbono said:


> And how is this algorithm supposed to tell what people want? It can't read minds and figure out what to churn out that will be the next big thing.


Again, this part is something that's difficult for humans and possibly easier for machines. I suspect it might already be happening - not composing the next big thing, but evaluating several human-produced songs and sorting them into "dated", "current", "potential next big thing" and "useless in any era" based on data on successful tracks of the past. For a human, that's a _very_ difficult job.

Here's how I got to thinking about that. Derya Uluğ is a Turkish pop singer who at the age of 29, with no previous music credits to her name other than some bits of backing vocals, released a very successful debut single in 2016. Then for seven months, nothing. This January she drops another single, again highly successful. Two songs, several months apart, two big hits - that's such an unusual career path that it made me wonder if maybe a bunch of other potential singles were recorded and left half-finished because some powerful analysis said they don't sound like the next big thing. I might be wrong, of course. It's not like I've deeply researched her biography. Maybe she was just really lucky. But, like I said, I like to think about these kinds of things.


----------



## Rohann (Jul 6, 2017)

Replicant said:


> My entire point is best summed up in saying that we don't _need_ hard AI for an AI to be able to create music that humans can "emotionally connect" with, if you will, in the same way that we do with music created by humans.


I agree it's plausible. My personal hunch is that it's not strongly so, but I'll agree it seems plausible, especially given the market for generic drivel. I don't think it will ever replace human writing or performance, however, and I'm confident that human psychology will lead people to at least partially disregard such music purely because it was procedurally generated, distinguishable or otherwise.



> I also disagree that "hard AI" isn't possible given your suggestion of the existence of a "soul" which I don't personally believe exists, but is rather a consequence of complex biological processes within the brain which we don't understand and perhaps don't _need_ to understand to set the wheels of its evolution occurring in code in motion. I think this is already demonstrated on a small scale by that article someone linked about Nvidia's driverless car — but that's a completely different conversation that will go nowhere, anyway.


I don't think it will go nowhere, as I find it a fascinating subject, but I agree this isn't really the place. That said, I don't find the arguments for property dualism robust.



Nokatus said:


> Ah, that goes into the territory of religion pretty much, then.


Not necessarily. Arguments for God's purpose certainly delve into the metaphysical, but an immaterial mind doesn't necessitate God. Teleology is also begrudgingly admitted by staunch atheist evolutionary biologists (e.g. Dawkins) in a very roundabout way. It was also formally argued for by Aristotle, certainly not a theist.



> There's nothing tangible to support something "magical" (not in a disparaging sense, but quite literally, something so great and of "spiritual origin" so that one could behold it as a miraculous thing) at work in the human mind; instead, there's just a belief, the one you are currently expressing, that qualia of different experience/emotion/etc states are something so intrinsically personal and human-like that an artificial consciousness couldn't experience similar things. So, in the end, it's a topic more suitable for personal beliefs and religion and such.


Again, it's not purely a matter of belief in a "blind faith" sense -- there's a significant body of highly compelling work by neuroscientists and philosophers on either side of the debate. Minimizing one side without giving it its true due isn't precisely fair. I'm happy to not have to break out books and references though, and simply to agree to let it be.



> Meanwhile, consciousness and AI research will carry on regardless of beliefs or opinions one way or the other.


Yup, to the complete disregard of the caution/apprehension expressed by world-leading thinkers (Hawking, Musk, etc.). For better or worse, I suppose. We may end up destroying this planet before AI gets that far (there's another inspiring gem).



> The thing I'm personally most pessimistic about is, the time might come surprisingly soon that there are artificial systems capable of having negative experiences -- say, on the level of animals. Attitudes like the one you're demonstrating will have an effect on our ethics toward systems like that. What rights will they have compared to biological entities?


I think that question delves even further back into metaphysical/ontological/epistemological perspective about the existence/systems of objective morality, etc. I think that will have to be delved into _if_ we get there. Fascinating topic though, no?



> The Chinese Room argument is a deeply flawed one in critiquing the AI concepts of today. It was constructed in an era when the approach to AI was the descriptive, rigid rule based algorithm kind, and it points out shortcomings in that line of thinking. However, what the future of AI looks like today is about artificial learning systems that shape and adapt their own functions and reactions through a dynamical, empirical learning model.


Mmm, in its most simplistic form, sure. Searle made a compelling defense of it later on, however, and Feser expanded on it further quite robustly to address exactly that.



AlexRuger said:


> What would lead you to believe that it's anything but?


Years in university studying neuroscience, psychology and philosophy, initially entering in with the assumption that mind is purely biological, coming out with a plethora of logical problems regarding materialism and accompanying monistic theories of mind (though I now wish I had studied more jazz theory...). I think the other side of the argument is sound and often ignored rather than examined seriously, often due to a bias in the current climate of neuroscience.
If you're at all interested, Feser's "Philosophy of Mind: A Beginner's Guide" is a rather fascinating introductory read.


----------



## Kyle Preston (Jul 7, 2017)

This is an endlessly fascinating conversation.



Replicant said:


> However, dystopia is the correct answer because any of the "optimistic" predictions have already been, time and again, proven to be infeasible; thus leaving only one option............
> 
> Even if a universal income was sustainable, it will render us all vegetables.



From one cynic to another, I think you're looking at this through too small a lens. It's easy to zoom in or out of any piece of data you want and see more dystopic outcomes. But it's far from the whole picture. And it's not black and white.

People who define (or reduce) themselves into absolutes like _work_ – yeah, they might feel like vegetables when that work disappears. But that's not what human beings are. Will some turn violent? Probably. But knowing a bit about psychology goes a long way with adapting. A lot of us _already have_ our basic needs met. If you think _work_ is the last bastion keeping *all* of us from becoming vegetables... then I guess you're more cynical than I am.

And I'll go further with one of my favorite Louis C.K. quotes:
_"When you have bacon in your mouth, it doesn't matter who's president"_


----------



## jcrosby (Jul 7, 2017)

Somehow this feels quite relevant...



(As does this...)


----------



## novaburst (Jul 7, 2017)

Replicant said:


> they'll take up arms. Crime will rise, anti-tech platforms will be a common theme o



It may happen that way, who knows, but I believe in innovation, in getting better and surpassing what we used to do.

It could be doom and gloom, crime and killing, or it could be a chance to grow even more.

I feel that when we are presented with challenges, they can cause us to excel; on the other hand, they may cause us to break.

So the saying goes: what did not kill me made me stronger. In other words, what did not kill my music made it better.

Unfortunately, or maybe fortunately, the computer is taking over quite a bit of our lives. When we go to the train station, many times there is no ticket person there to serve you; it's all machines. Cars can now be driverless, and even trains are operating driverless. It's all computers doing what once used to be done by humans.

So where does that leave us? Do we go back to the dark and lawless ages, or do we go forward yet again?

History tells all of us that humans have continued to strive for better. We have gone to the moon, we fly lumps of metal with passengers aboard, we have searched the galaxy and the deep blue sea. It seems that wherever there is a boundary, it has been broken, not by animals or computers; these great boundaries have been broken by humans.

Which leaves me to say: we only know the way forward, and we will keep going forward, breaking more and more boundaries.

We love to create, and we will keep doing so.


----------



## paoling (Jul 7, 2017)

The biggest current limitation of AI is that its rules are prewritten in code. There are some rules about which chord changes sound best and how to build a melody, but all these rules are selected by a person who "teaches" the software.

Machine learning is the key.
The most promising innovations in AI will come as soon as the software is able to teach itself by analyzing a collection of human-made creations. There is software that can replicate the style of a particular painter in history, and there's this nice QuickDraw from Google:
https://quickdraw.withgoogle.com/

The patterns that the neural network creates closely resemble how the human brain works. It has the ability to fail, like our thinking process, but this failure rate is reduced as more input data is provided to the system.

We have prewritten the rules for a calculator, but if we built a computer that was able to infer the mathematical rules from a collection of pre-solved equations, that machine could go on to develop and discover new mathematical things. The same goes for every area of human life. Even music.
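The calculator example above can be made concrete with a tiny sketch (my own illustration, not code from any of the products discussed): a model that is never told the rule "c = a + b", but recovers it purely from a collection of pre-solved examples.

```python
# A minimal sketch of learning a rule from examples instead of
# hardcoding it. The model below is never told "c = a + b"; it must
# infer the rule from pre-solved (a, b) -> c pairs via least squares.
import numpy as np

rng = np.random.default_rng(0)

# Training data: 200 random pairs with their "pre-solved" answers.
X = rng.uniform(-10, 10, size=(200, 2))
y = X[:, 0] + X[:, 1]

# Fit y ~ w1*a + w2*b. Discovering w1 = w2 = 1 is discovering addition.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.round(w, 6))  # weights come out very close to [1, 1]
```

The same principle, scaled up enormously and with nonlinear models, is what lets a system infer stylistic "rules" from a corpus of paintings or compositions rather than having a programmer write them down.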


----------



## Phillip (Jul 7, 2017)

Interesting discussion. Humans can hardly write music themselves. Now they're creating AI to do that task for them? Very delusional.


----------



## Nokatus (Jul 7, 2017)

Rohann said:


> Not necessarily. Arguments for God's purpose certainly delve into the metaphysical, but immaterial mind doesn't necessitate God. Teleology is also begrudgingly admitted by staunch atheist evolutionary biologists (i.e. Dawkins) in a very roundabout way. It was also formally argued for by Aristotle, certainly not a theist.



Religion doesn't necessitate belief in a godlike being.




Rohann said:


> Again, it's not purely a matter of belief in a "blind faith" sense -- there's a significant body of highly compelling work by neuroscientists and philosophers on either side of the debate. Minimizing one side without giving it its true due isn't precisely fair. I'm happy to not have to break out books and references though, and simply to agree to let it be .



We can agree to let it be, and you have undoubtedly gone through this discussion countless times, as have I, on and off campus, so I can predict how it's going to proceed if we continue. In the end, the compelling work you refer to isn't convincing enough to make the scientific community go "ohh, now I see, yes this is very compelling indeed" ; ). That's why it's a debate, after all, and there are lots of interesting things being debated academically, of course. In my experience, those particular arguments still always lead back to being based on the principle of "I just have this feeling there's something more, you know, and that's why I figured it might work this way", instead of some concrete findings that would make everyone take notice. It's just dressed in various levels of intellectual camouflage, intentional or not, like Searle's later defense of the Chinese Room. The defense itself may come from a very honest place, intellectually, and by that I mean Searle himself very probably really felt he was right about what he was arguing (instead of trying to obfuscate the issue), but he largely misses the point of modern simulation/dynamic/etc paradigms and "updates" the Chinese Room in a way that is clearly rooted in the old data theory of artificial consciousness.



Rohann said:


> I think that question delves even further back into metaphysical/ontological/epistemological perspective about the existence/systems of objective morality, etc. I think that will have to be delved into _if_ we get there.



It actually doesn't. Most of all, it's a practical concern. In comparison: as it's so difficult to approach artificially creating a consciousness comparable to our own, what about that of a small animal? Vertebrates are quite complex things, yes, but in the end, consider an artificial entity comparable to a mouse, for example. Or a fish? Mice and fish go through emotional states like fear, pleasure, social anxiety/satisfaction and so on. At what point does the intangible non-biological component of consciousness come into the picture? (This is a rhetorical question, as I still agree we probably shouldn't pursue this discussion further.)

In other words, if we at some point have an artificial system quite indistinguishable from, say, a common housefly, is it somehow a lesser being than its biological counterpart if there's no actual evidence, at that point, indicating that it's indeed missing something, however trivial in the case of such a relatively tiny creature? Increasing the complexity from there, perhaps at some point arriving at the realm of functionality that indeed demonstrates the more complex fear responses and similar, seen in small animals -- that is the point when there should be actual compelling _evidence_ that a consciousness needs that magic component of "something else" in order to be considered a "real consciousness", if you will. If there is no such evidence, continuing to support the viewpoint that systems like that are of a lesser nature, and basing ethical decisions on such viewpoints... can be quite disturbing indeed. Also because it sets a precedent for how such issues are dealt with, going further from there.

But that's enough about that. You know my viewpoint, and I know yours . We disagree, we both are academic, and we both have our reasons for having arrived at the (vaguely off-topic) place we currently are in, haha.


----------



## Fer (Jul 7, 2017)

Nokatus said:


> We are already way past this sort of rule-based developer-defined approach. One of the biggest challenges in AI, currently, is actually understanding (after the fact) why an AI made a particular action. The actions aren't preconfigured anymore, in the fashion you describe. The paradigms these days are different than the rigid engineer established, do-this-don't-do-that, etc, algorithms were.
> 
> This is a nice read on that subject, despite the rather dramatic headline: https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/
> 
> Also, chess and its traditional developer driven weighting rules you are referring to... are quite old beans . Instead, the fact that a deep learning system was able to beat a go master, that's quite something actually: https://www.wired.com/2016/01/in-a-...gles-ai-beats-a-top-player-at-the-game-of-go/



Hey, I read the article and it is very interesting. I was aware that chess algorithms are stone-age nowadays. I think I understand the concept behind these chess algorithms (without knowing anything about programming), but I really don't understand the concepts behind these new AI systems… I will look for some popular-science material.

In any case, I think that semantics are crucial when talking about these issues. It is true that current chess software beats the world chess champion easily. But in my opinion it is safe to say that the statement "chess software does not play chess" is also true. If I say that, it is because (to me at least) "playing chess" is an expression that implies understanding of meaning, purpose in action, understanding of that purpose, the desire to win and much more. Those are things you can detect in yourself when you play chess, and also when you write a post or when you think about the ultimate meaning of life….

But there is no evidence at all that such things are present in machines equipped with old-school algorithms. In fact there is evidence to the contrary, because, as I said, the algorithm could be designed in a form in which the machine assumes that "5<2" is true, and that is a mistake. The total lack of understanding, the total blindness of dead things, is required in order to "assume" that "5<2" is true. But there is more: "assuming that something is true" is an expression that makes sense only in relation to intelligent beings; only intelligent beings understand the difference between true and false. Machines are not really assuming anything in the strong sense of the word. So what we really have is a dead machine working as expected.

On the other hand, those old-school algorithms are in fact *mimicking* intelligent behaviours. In fact they are playing moves far more powerful than the ones a truly intelligent human being can imagine. I read some remarks by Kasparov from the first time Deep Blue beat him. He said that he felt some kind of true intelligent behaviour behind the machine… But this is reached through brute-force calculation, not through true intelligence. And we know that because old-school algorithms are not a black box for us, since "we" wrote them.

For these reasons I am very skeptical of statements like "deep learning systems can recognize images" or "deep learning systems can understand words". I think this way of talking uses words in the wrong way, because learning, understanding, recognition and perception are things that we have as living beings. Yes, it seems they can be *imitated* by intricate new AI systems. But where is the evidence that those AI systems are conscious living beings with an inner life, really recognizing and understanding words and meanings in the same way we do?

The article says that current AI systems are black boxes and that the scientists cannot know why they output certain results. I can deal with that. But you have to consider also that the world chess champion doesn't understand why the old-school algorithm is playing a given move at a given moment. The "purpose" of that move by a non-intelligent machine only becomes clear to human intelligence many more moves later than the human champion is capable of calculating. AI systems are black boxes designing their own methods of calculation… and it's awesome, really, and scary. But there must be hardware and software in the beginning. And some kind of human input in the beginning (as in the old-school algorithms) that the machine accepts without understanding it, because it is not a living being…

In any case, it seems that these AI systems could generate genuinely good music without being truly intelligent and without having feelings. And I welcome that! But let me add a question:

The history of music has evolved by exploring new directions; without departing from the "well-known territories", impressionism or jazz would never have developed. But the search for new musical possibilities is not a blind search; it has been carried out by musicians with good taste who can discern the aesthetic value of never-before-heard musical structures. They try to improve, they experiment, they accept certain ideas and reject others, based on their taste and goals, etc. So they are equipped with consciousness, taste, goals, purpose, understanding of meaning, feelings, emotional reactions to their musical experiments… They can evaluate the beauty of a harmonized melody that breaks all the rules... Do you think that machines could program themselves to discern new forms of good music just by analyzing what we currently consider good music? Is there something more to these new AI systems than brute force? We will see whether machines come up with new, interesting music, but the ultimate judge will always be the human being.


----------



## Nokatus (Jul 7, 2017)

Fer said:


> But there must be a hardware and a software in the beginning. And some kind of human input in the beginning (like in the old school algorithms) that the machine is assuming without understanding it because is not a living being…



I was equipped with biological hardware and very tangible human input  in the very beginning, and I had my genetic structure to guide my development. Then, a staggering variety of (contextual, conceptual, also of the non-verbal kind, both deliberate and undeliberate, spontaneous and so on and so on) human input. The fact that there was a learning process going on doesn't mean I understood the "underlying algorithms" while learning to talk, for example. At all. Back then, it just happened.

Then, more learning, unintentionally and intentionally, by experiencing everything around me, and bumping into new things, sometimes literally. At some point I learned about learning, and learned about more abstract concepts, little by little. The current artificial learning systems aren't nearly this sophisticated, hah, but surely you can see the point? When I was two, I surely didn't have an actual understanding of language, but I was learning it all the same, and I doubt anyone would have seriously challenged whether I had a consciousness or not. Alas, little by little I started reflecting on things like this at quite an early age. (Might have something to do with how I started reading when I was three. I have fond memories of what that particular learning process felt like  ).

The interesting stuff in the artificial learning field is just beginning. The more we move into more complex, and at the same time _more generic_ learning systems (in the sense that you can teach them things in ^ this manner, starting out by just coping with inputs and operating in the world, with enough power and plasticity to deal with it all, and then proceeding to learn more intricate skills on top of that, instead of starting out on specific sets of problems and data), the closer we come to the moment when we have something that, at the same time, appears conscious but is also artificial and somehow alien to our understanding. Chances are, even as the end result will be far from our own conscious faculties, at least for a good while, we can't just easily "open it" and see why it appears to be the way it is, any more than we could just "open" our brains and easily find out what makes us tick.

At that point, if you want to believe in a "non-biological" or even "non-physical" component of consciousness, you can then point at your brain and go: "But it's just a brain, it needs more to really feel these things", and then someone points at the artificial construct and goes "but it's just an artificial construct, it needs more to really feel these things", and maybe, as a thought experiment, you can then stare at both of those conundrums that appear conscious even as they sort-of-shouldn't, and assume that there is a non-physical element at the core of that conscious experience, and both of those constructs (the biological AND the artificial one) are somehow invoking that same elusive property.


----------



## Rohann (Jul 7, 2017)

Nokatus said:


> Religion doesn't necessitate belief in a godlike being.


True, but slapping a "religious" tag on the side of immaterial consciousness reads as a straw man intended to undermine the arguments and evidence for it.



> In the end, the compelling work you refer to isn't convincing enough to make the scientific community go "ohh, now I see, yes this is very compelling indeed" ; ). That's why it's a debate, after all, and there are lots of interesting things being debated academically, of course. In my experience, those particular arguments still always lead back to being based on the principle of "I just have this feeling there's something more, you know, and that's why I figured it might work this way", instead of some concrete findings that would make everyone take notice. It's just dressed in various levels of intellectual camouflage, intentional or not, like Searle's later defense of the Chinese Room. The defense itself may come from a very honest place, intellectually, and by that I mean Searle himself very probably really felt like he was right about what he was arguing (instead of trying to obfuscate the issue), but he is also largely missing the point of modern simulation/dynamic/etc paradigms and "updates" the Chinese Room in a way that is clearly rooted in the old data theory of artificial consciousness.


The "scientific community" (in quotes because there are obviously scientists on the other side) is also routinely guilty of materialistic bias that comes from speaking far out of turn (i.e. dismissing things their field doesn't deal with out of hand despite not having any authority to do so, ala scientism, which is a logically indefensible viewpoint). Many of these thinkers have a bias that commits terribly basic logical fallacies, as they fail to see the necessity for reason as a precursor to science's success. The Chinese Room may be criticized but there are highly rigorous arguments coming from neuroscience as to why there's also no evidence for the complete system we call "mind", and logical problems with it to begin with. The frustrating part is how often I've heard them reduced to "it's just a feeling, not science", which ironically fails to address any of these arguments on any serious level and largely relies on an appeal to nonexistent evidence (we'll find out in the future because neuroscience) and the ignoring of glaring logical issues in positivism/scientism/materialism. I can agree to disagree and let it be as this is an extensive academic discussion, but scientism is too often peddled as a robust worldview, and I have a problem with subtle language being used to undermine highly rigorous academic arguments that very much take neuroscientific data into account (also seeing as it's what I spent a good chunk of my uni time in).

Of course, I hope none of this comes across as personal as none of that is my intent.


> It actually doesn't. Most of all, it's a practical concern. In comparison: as it's so difficult to approach artificially creating a consciousness comparable to our own, what about that of a small animal? Vertebrates are quite complex things, yes, but in the end, still consider an artificial entity comparable to a mouse, for example? A fish? Mice and fish go through emotional states like fear, pleasure, social anxiety/satisfaction and so on. At what point does the intangible non-biological component of consciousness come into the picture? (This is a rhetorical question, as I still agree we probably shouldn't pursue this discussion further  ).


As tempting as ethical discussion is, I'll agree to disagree and let this one go 




> But that's enough about that. You know my viewpoint, and I know yours . We disagree, we both are academic, and we both have our reasons for having arrived at the (vaguely off-topic) place we currently are in, haha.


Yeah, I should really get back to learning my musical craft so I'm not easily replaceable.


----------



## Fer (Jul 7, 2017)

Nokatus said:


> I was equipped with biological hardware and very tangible human input


Well, we are pushing the thread toward the most philosophical derivation of the issue…. which for some reason makes me feel very bad :D but I believe that a living being is not a machine and that there is an abyss between machines and human beings. I also believe there are very strong arguments to sustain this view. But who knows… perhaps I will change my opinion if someday I see an AI system designing another AI system which outputs correct results through mysterious procedures that "he" can't understand; and "he" is doing it just for fun, or just because he has the desire to reach the level of his creator, the human being 

Now I need popcorn while I finish reading this thread ...


----------



## Nokatus (Jul 7, 2017)

Rohann said:


> I have a problem with subtle language being used to undermine highly rigorous academic arguments that very much take neuroscientific data into account (also seeing as it's what I spent a good chunk of my uni time in).



I have a problem with literally the same thing often when dealing with the viewpoint you are representing . So it's better to call it quits, as this is not the time or the place. Wishing you well!


----------



## Rohann (Jul 7, 2017)

Nokatus said:


> I have a problem with literally the same thing often when dealing with the viewpoint you are representing . So it's better to call it quits, as this is not the time or the place. Wishing you well!


Fair. As I do you! Thanks for the civil discussion; I appreciate that this hasn't devolved into ad-hominem slinging from any party present in this thread. It happens disappointingly often in internet debates.

I should really get back to watching all those Verta classes I bought...


----------



## NoamL (Jul 7, 2017)

Here is an interesting article about machine learning...

https://blog.openai.com/unsupervised-sentiment-neuron/

The researchers set up a neural network that was trained on 80 million Amazon reviews. The goal was to train the network to predict, letter by letter, what the next character in a review would be. They were able to do this successfully.

Here is the twist. They tried to take this AI and train it to do a second task - judging the sentiment (positive/negative) of a review. But they found the AI barely needed to be trained at all - much less training than a "blank slate." When they looked inside the letter-predicting neural network, they discovered the AI had already developed a single neuron that was turning on/off in a way that was highly associated with sentiment.
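The letter-by-letter objective described above can be made concrete with a toy sketch. (The actual OpenAI model was a large recurrent network trained on roughly 80 million reviews; the bigram counter below, which I wrote purely for illustration, only demonstrates what "predict the next character" means as a training task.)

```python
# Toy sketch of the next-character-prediction task: given one
# character, guess the most likely character to follow it, based
# on counts gathered from a (tiny) corpus of review-like text.
from collections import Counter, defaultdict

corpus = "great product. great price. terrible support. great value."

# Count how often each character follows each other character.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(ch):
    """Most frequent next character after ch in the corpus."""
    return follows[ch].most_common(1)[0][0]

print(predict_next("g"))  # 'r' -- every 'g' here is followed by 'r'
```

A real model replaces these raw counts with a neural network conditioned on the whole preceding text, which is what allows internal features, like the sentiment neuron, to emerge.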

Here is how that neuron was "reading" one Amazon review:







This has some pretty fascinating/creepy implications.


First, that the AI "taught itself" something that nobody was trying to teach it (or at least came very close to that, such that minimal additional training was required).

Second, that the AI discovered a relationship that the scientists didn't anticipate and still don't fully understand. If you gave someone the task of programming by hand an AI that predicts the next letter of a text chunk, they wouldn't immediately think of including a sentiment analysis subroutine, but it turns out that including that subroutine DOES give you an advantage... somehow. Maybe negative reviews are more repetitive or contain a narrower vocabulary? (think of Trump tweets: everything is "sad" and "the worst!") The researchers still don't know WHY the AI grew that neuron, which in a sense makes the AI already smarter than the researchers.

And third, creepiest of all, is imagine what happens when this scales up. Maybe in the future some AI researchers are training a neural network to achieve some more advanced task and the AI, in its evolutionary search to grow a network to achieve that task, accidentally grows enough neurons in a specific pattern to be able to feel pain. Or hostility.


----------



## Kyle Preston (Jul 7, 2017)

Rohann said:


> The "scientific community" (in quotes because there are obviously scientists on the other side) is also routinely guilty of materialistic bias that comes from speaking far out of turn (i.e. dismissing things their field doesn't deal with out of hand despite not having any authority to do so, ala scientism, which is a logically indefensible viewpoint). Many of these thinkers have a bias that commits terribly basic logical fallacies, as they fail to see the necessity for reason as a precursor to science's success.



I've found so many of your points in this thread illuminating @Rohann. I love philosophy and the humanities (and neuroscience, for that matter). Ultimately, I studied Physics & Astronomy in school rather than philosophy (thank God I didn't study music at uni). If you'll allow me to play devil's advocate, science itself is founded on the principle of rejecting arguments from authority. Go out there and test it, remove your human bias... etc.

Now I'll admit that my professors were quick to ignore the philosophical underpinnings of some of their scientific arguments (I think most professional scientists aren't interested or are completely oblivious to philosophy). But, we've arrived at a time where discovering truth (at least cold hard scientific truth) can't really be obtained through introspection with pen and paper. We're far beyond that. And I've heard so many philosophers criticize scientists for 'speaking out of turn' or speaking 'without authority' when in reality, these philosophers aren't testing anything tangible, they're just regurgitating arguments from other philosophers who were regurgitating arguments from other philosophers etc...

I'm just imagining a scientist who hears a philosopher say "Shut up, you have no authority. I've been reading and writing about this topic for years", and the scientist goes, "Fine, piss off, I'll get to the bottom of this with a microscope, a laptop and some tangible _material_ evidence". Scientists don't respect arguments from authority because arguments from authority don't deserve to be respected.

p.s. Didn't mean to pour vinegar on the conversation. If we're all tired of this aspect of the topic, I'll happily shut up .


----------



## NoamL (Jul 7, 2017)

Kyle and Rohann, you're both right. Science is great but we need to be *very* wary of scientism.

Michio Kaku is a good example of scientists not staying in their lane. Dr. Kaku knows a little of nothing about everything. Sarcasm aside, there is no doubt he's a brilliant and qualified _physicist_ but he leans hard into the idea that being a scientist lets you talk about everything (like some other science popularizers, such as Neil DeGrasse Tyson). Here is Dr. Kaku at his finest:



This is pretty close to 100% flapdoodle... I have an evo bio undergrad degree and even I can see why this doesn't work!

It gets the physics right, but is completely wrong about anatomy and evo bio... and just bewildering on the philosophical stakes, and it mixes all four subjects into this strange mishmash of fleeting thoughts and overcertain conclusions that makes "science" sound like "whoever has the coolest most interdisciplinary and blogworthy idea wins".

Actual biologists have been interested in quantum effects on neurons. All chemistry involves quantum mechanics, of course. But a neuron firing is a huge event, chemically speaking. The axons (the "wires" of your nervous system, in the electricity metaphor) can be millimeters wide, and charge propagates by opening zillions of chemical gates and letting sodium ions into the cell. Quantum effects are like one electron tunneling across a membrane. So this is like saying a particularly ferocious basketball game can alter the Earth's orbit - cuz of all the dribbling, you know...

So that's how he gets the anatomy wrong. It's a huge scale mismatch. The time scale is off too. Neuro takes place on the order of milliseconds (depends on the length of the axon) while quantum effects are on the order of picoseconds.
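The scale mismatch above can be put in rough numbers (order-of-magnitude figures only; exact values vary by neuron and by quantum process):

```python
# Order-of-magnitude comparison of the two timescales in question.
neuron_firing_s = 1e-3    # a spike unfolds over roughly milliseconds
quantum_event_s = 1e-12   # tunneling events happen over ~picoseconds

ratio = neuron_firing_s / quantum_event_s
print(f"{ratio:.0e}")  # ~1e+09: about nine orders of magnitude apart
```

Nine orders of magnitude is the gap between a heartbeat and a human lifetime, which is why individual quantum events are such implausible drivers of whole-neuron behavior.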

And the evo bio is worse. An axon can either fire on or stay off. All that introducing quantum does is add uncertainty. You can see how this reduces fitness. Imagine a single neuron in a cockroach-like creature - it runs from a light sensor to a muscle fiber, and makes the roach scurry away when it senses light. Now introduce a mutation that makes the axon narrow enough for quantum tunneling to be impactful. So we've introduced uncertainty - a chance for false positives (misfires) and false negatives (inaction). The result is a decline in fitness (an increase in getting eaten!). The moment you get a mutation that makes the axon wide enough to de-fuzz its firing, that version is going to predominate. Newton's cockroach beats Schrodinger's every time! Now maybe there is some inscrutable and unanticipated advantage to indeterminacy in very complex neural systems, but Kaku doesn't make an argument for this and seems to not even acknowledge that natural selection exists.

And then finally when he goes from physics to biochem, and from biochem to evo bio, and from evo bio to philosophy, he fails (or at least is incoherent?!) at this stage too. Like it seems the thesis he is roughly making is "deterministic world = no free will; indeterminism = free will." But this is inane, a system that has too many unknowns to predict isn't the same thing as having will. For example we don't say that dice or hurricanes have free will just by virtue of their inscrutability. Even if dice were inscrutable by pure theory (like, you know all the Newtonian variables about a dice's position, velocity, rotation, but there's some magic Schrodinger-analogous variable that makes it inscrutable anyway) we still wouldn't say the dice have will - just that they're arbitrary. I'm too stupid to really understand the free will debate (good arguments on both sides) but I know for sure that a physicist saying "Heisenberg uncertainty. There, solved it for ya" is talking wayyy out of his lane. The problem for Kaku's "big think" here is that quantum mechanics isn't a people thing. There are exactly as many quantum-tunneling effects happening in the nearest rock as there are in your brain...


----------



## AdamAlake (Jul 7, 2017)

Fer said:


> Hey, I read the article and it is very interesting. I was aware that chess algorithms are stone-age nowadays. I think I understand the concept behind these chess algorithms (without knowing anything about programming), but I don't yet understand the concepts behind these new AI systems… I will look for some popular-science material.
> 
> In any case, I think semantics are crucial when talking about these issues. It is true that current chess software beats the world chess champion easily. But in my opinion it is safe to say that the statement “chess software does not play chess” is also true. If I say that, it is because (to me at least) “playing chess” is an expression that implies understanding of meaning, purpose in action, understanding of that purpose, the desire to win, and many more things. Those are things that you can detect in yourself when you play chess, and also when you write a post or when you think about the ultimate meaning of life….
> 
> ...



What proof do I have that you are a truly conscious being? None.


----------



## Replicant (Jul 7, 2017)

Phillip said:


> Humans can hardly write music themselves.


----------



## Kyle Preston (Jul 7, 2017)

NoamL said:


> Science is great but we need to be *very* wary of scientism.



Couldn't agree more, Noam! I get the sense that scientists like Krauss, Dawkins, etc. don't even realize their own assumptions. And that *is* a problem. Disregarding other fields of study breeds scientism, and personally, I get secret joy-gasms when I hear thinkers from other disciplines poke fun at that ivory-tower scientific snobbishness. I'm not sure I'd include Tyson in that group though; he seems to really cherish the value of public scientific literacy (I'm biased though, as my expensive piece of paper, the source of all my debt, came from the Astronomy & Astrophysics depts). Krauss and Dawkins are more confrontational about it, which...whatever...neat.

I listened to the Michio video (without reading the title) and honestly, I'm pretty sure he's just summarizing (clunkily) physics from Newton to Einstein to Heisenberg. When Einstein and co. were presenting ideas, they assumed these principles dictated EVERYTHING, which, like you said, they clearly don't; quantum especially, because we now know it deals _only_ with the very small (maybe this is typical physicist arrogance). But then I read the title of the video, which is just classic click-bait bs. Which describes so many of the _Big Think_ videos. Which makes physicists look silly. I'm blaming _Big Think_ for this.

Anyway, I completely agree with every part of your post. Especially the false equivalence drawn between randomness and consciousness. I think (like you said) Dr. Kaku is assuming that if the world is deterministic, true consciousness and free will can't exist. And I'm not sure why this assumption is wrong. Though I try to remember the world is always weirder than we anticipate. It'd be amazing to discover that free will exists in a deterministic universe.


----------



## NoamL (Jul 7, 2017)

Yeah Dawkins has become pretty bad, almost a Twitter pundit at this point, which is sad as he is a legit genius in the field of evolutionary biology.

Tyson is a good guy and always stresses the value of admitting "I don't know" as the first step before "Let's go find out."

The problem I have with Tyson goes back more to his mentor/inspiration, Sagan, which is this idea that atheists should weaponize science into a pseudophilosophical way to shoot back at the religious non-sequitur arguments about how "Atheists don't have values / feel awe / can explain beauty." To which Sagan replies: "Hey, but didja know your body is made of STARSTUFF forged in SUPERNOVAS? And your brain is the Universe thinking about itself! Wow!" Science doesn't need to be a surrogate religion to be true. Sam Harris is really bad at this too; he thinks that scientific truths can be the entire basis for ethics. Like naive trolley-problem utilitarianism, but unironically.


----------



## jononotbono (Jul 7, 2017)

AdamAlake said:


> What proof do I have that you are a truly conscious being? None.



I'm pretty conscious at the minute but to be honest I've only had one beer.


----------



## MacTomBie (Jul 7, 2017)

What an interesting discussion! My take on the subject is that algorithmic composition will never replace the best composers (unless we’re talking about real Big AI), but frankly speaking, I’m surprised that today’s software struggles to imitate even the mediocre ones. People create a lot of crappy, derivative music that is nonetheless used commercially, and with today’s state of AI, computers should already be able to do better than that.

In fact, this was the main idea behind my master’s thesis about 10 years ago. I created an app that was supposed to prove that one could encode basic composition rules into an algorithm that could rival some real composers (beginners and crappy ones). Well, I graduated, so I guess it worked. You can find some examples here and even download the app and try it for yourself (hope it still works). It’s not as impressive now as it was 10 years back, but I think it’s still pretty cool. I even licensed the algorithms to one of the major DAW developers (don’t remember if I can name them, would have to look into the NDA), but I don’t know if it ended up in any of their software.

Anyway, I guess the main reason this field is not progressing as fast as others is that the intersection between top AI programmers and top composers is very, very small. And you really need a good grasp of both to write such software, even if you take the shortcut route and use machine learning. But I’m sure we’ll get there soon, as there is money in it.


----------



## jononotbono (Jul 7, 2017)

MacTomBie said:


> What an interesting discussion! My take on the subject is that algorithmic composition will never replace the best composers (unless we’re talking about real Big AI), but frankly speaking, I’m surprised that today’s software struggles to imitate even the mediocre ones. People create a lot of crappy, derivative music that is nonetheless used commercially, and with today’s state of AI, computers should already be able to do better than that.
> 
> In fact, this was the main idea behind my master’s thesis about 10 years ago. I created an app that was supposed to prove that one could encode basic composition rules into an algorithm that could rival some real composers (beginners and crappy ones). Well, I graduated, so I guess it worked. You can find some examples here and even download the app and try it for yourself (hope it still works). It’s not as impressive now as it was 10 years back, but I think it’s still pretty cool. I even licensed the algorithms to one of the major DAW developers (don’t remember if I can name them, would have to look into the NDA), but I don’t know if it ended up in any of their software.
> 
> Anyway, I guess the main reason this field is not progressing as fast as others is that the intersection between top AI programmers and top composers is very, very small. And you really need a good grasp of both to write such software, even if you take the shortcut route and use machine learning. But I’m sure we’ll get there soon, as there is money in it.



Exactly. You've ultimately summarised the whole thread with your post, in my opinion. AI will never (yes, I said the word never, because opinions are human and AI will never have an opinion) be able to patch together music that will rival the best music actually written by humans. The drivel (from all genres and musical worlds) will be replicated eventually and serve as the fast-food stuff that will be perfectly adequate for its purpose.


----------



## Rohann (Jul 7, 2017)

Kyle Preston said:


> I've found so many of your points in this thread illuminating @Rohann. I love philosophy and the humanities (and Neuroscience for that matter). Ultimately, I studied Physics & Astronomy in school rather than philosophy (thank God I didn't study music at Uni). If you'll allow me to play devil's advocate, science itself is founded on the principle of rejecting arguments from authority. Go out there and test it, remove your human bias...etc


Glad to hear that, thank you! I've been sick the past few days with a foggy head, so I'm glad it was at least half-coherent.
Physics and Astronomy are two fields I only dabbled in during undergrad, but I would have loved to continue studying them, particularly Astronomy. A solid background in physics would certainly be advantageous in philosophy as well.



> Now I'll admit that my professors were quick to ignore the philosophical underpinnings of some of their scientific arguments (I think most professional scientists aren't interested or are completely oblivious to philosophy). But, we've arrived at a time where discovering truth (at least cold hard scientific truth) can't really be obtained through introspection with pen and paper. We're far beyond that. And I've heard so many philosophers criticize scientists for 'speaking out of turn' or speaking 'without authority' when in reality, these philosophers aren't testing anything tangible, they're just regurgitating arguments from other philosophers who were regurgitating arguments from other philosophers etc...
> 
> I'm just imagining a scientist that hears a philosopher say "Shut up, you have no authority. I've been reading and writing about this topic for years," and the scientist is like, "fine, piss off, I'll get to the bottom of this with a microscope, laptop and some tangible _material_ evidence". Scientists don't respect arguments from authority because arguments from authority don't deserve to be respected.


I think NoamL summed this up quite well. I agree -- fallacious appeals to authority ("you don't have X degree so you can't comment", etc.) are simply that: fallacious. However, as NoamL stated, _scientism_ is the issue, namely the idea that scientific (read: empirical) evidence is the only source of valid knowledge. The irony with this view is that _reason_ (logical argument) is necessary for science to be valid, as well as necessary to defend scientism, so the minute one attempts to defend scientism, one instantly refutes it.

My issue isn't scientists speaking from within their field, or even when they dabble outside their field; it's when they fallaciously appeal to their own authority (as scientists) as some sort of ivory-tower intellectual authority (as stated previously) allowing them to ignore or dismiss opposing arguments (often outside their field) _without actually addressing them_. Again, it's not that they aren't _allowed_ to comment or delve into other fields; it's when they think they can simply ignore other fields of study, or believe their expertise in their own field overshadows others, which is very basically flawed thinking (and to undermine the importance of sound reason in thinking is to undermine science as a whole -- difficult not to point out the circularity here). I.e. just because a philosopher is presenting an argument posed in the past doesn't at all invalidate it, and to assume so out of hand is fallacious. Furthermore, while there are philosophers who commit elementary fallacies as well, there are also many (like Feser, who I've mentioned a few times) who did the due diligence of educating themselves appropriately in neuroscience to be able to answer objections in an informed manner.

Dawkins is a fantastic example of someone making ridiculous, fallacious leaps on matters of metaphysics and ontology that wouldn't fly in philosophy 101 (seriously -- he and Dennett are considered "too easy targets" and are both criticized by their own colleagues for their lack of rigour). I can't comment on Dawkins' work in evo bio (from what I understand he's quite well respected), but when it comes to arguments about the nature of the universe, many of his arguments are YouTube-comment level. Dennett does the same thing on occasion.

As an illustration, a concrete example would be (this is highly simplified) Dawkins' belief that God is _proven_ not to exist because of evolution. Regardless of one's worldview, that's a leap far out of his field, and a conclusion his argument doesn't reach in the least. Similarly, it's sort of like a layman (me) reading up on the double-slit experiment and concluding that consciousness is an active force that manipulates all matter based on a single study -- I don't have the background to understand this field thoroughly, nor is my argument itself robust enough to posit that honestly.

On the other hand, I personally know science grads (one prominent friend that pops into mind has a PhD in physical chemistry) who criticize other scientists for their scientistic bias, and subsequently recognize that certain questions are outside their field (and outside science, in some cases -- not that science isn't involved in metaphysical debates, but that it's not the ultimate authority). Rather than ignore them, they seek to educate themselves outside their field, and thereby don't speak out of turn.

(_Edit: this makes me think of attitudes common in the western medical system -- how doctors tend to be revered for their knowledge on *every* health topic, such as nutrition and injury prevention, despite spending mere weeks on those topics in med school [if that] and blatantly disregarding scientific research in their recommendations_).

I think it's why, in some cases, I respect some forms of functionalism more -- it's at least an admission that certain questions, i.e. in neuroscience, are outside the breadth of their scope, and they simply choose not to focus on them, focusing instead on their own field. An unwillingness to speak beyond one's topic I can respect.


> p.s. Didn't mean to pour vinegar on the conversation. If we're all tired of this aspect of the topic, I'll happily shut up


Not at all! I think it's an important clarification. TL;DR -- my main points, I suppose, are: don't make leaps out of one's own field if one is ignorant about where one will land, and recognize that sound reason is the most basic and necessary principle to exercise in any field.



NoamL said:


> Kyle and Rohann, you're both right. Science is great but we need to be *very* wary of scientism.
> 
> Michio Kaku is a good example of scientists not staying in their lane. Dr. Kaku knows a little of nothing about everything. Sarcasm aside, there is no doubt he's a brilliant and qualified _physicist_ but he leans hard into the idea that being a scientist lets you talk about everything (like some other science popularizers, such as Neil DeGrasse Tyson). Here is Dr. Kaku at his finest:
> 
> ...



Well said. "Big Think", in general, features some pretty dubious perspectives (Dennett comes to mind as well), and their own bias is made clear by many of the topics presented.



> Yeah Dawkins has become pretty bad, almost a Twitter pundit at this point, which is sad as he is a legit genius in the field of evolutionary biology.


Yeah he's somewhat comical at this point. I think he's at least had the sense to quit making a fool of himself in debates; he's been pretty heavily criticized by philosophers who share the same worldview as him.


----------



## NoamL (Jul 7, 2017)

Woah, Dennett is a lot better than you give him credit for Rohann.

I think half of the backlash to people like Dennett and Patricia Churchland is because if they're right the party's over. Most of philosophy historically turns out to be arguments about folk-psych concepts with no real referents in the brain. It doesn't mean the death of philosophy in the smug Sam Harrisian sense but it does mean that philosophical _literacy_ turns out to be less useful going forward because it all turns out to be a bunch of people with no concrete brain-knowledge exercising their intuitions. Just like today nobody respects Thomas's natural theology (well, except Thomists) because they understand perfectly well that St. Thomas didn't have any real, empirical knowledge in cosmology.

Of course, some of the whatever-the-opposite-of-backlash-is?... bandwagoning? Some of the bandwagoning on Dennett and Churchland is by atheists who just don't want to think about philosophy at all.


----------



## Rohann (Jul 7, 2017)

NoamL said:


> Woah, Dennett is a lot better than you give him credit for Rohann.
> 
> I think half of the backlash to people like Dennett and Patricia Churchland is because if they're right the party's over. Most of philosophy historically turns out to be arguments about folk-psych concepts with no real referents in the brain. It doesn't mean the death of philosophy in the smug Sam Harrisian sense but it does mean that philosophical _literacy_ ("I evaluate Dennett in light of Grice's reading of Quine's reading of Kant") turns out to be less useful going forward because it all turns out to be a bunch of people with no concrete brain-knowledge exercising their intuitions.


Sorry, I shouldn't put him in the same category as Dawkins.

But he does argue in a circular manner and has still been criticized by members of his own field for a lack of robustness. There are philosophers of mind now who are more than qualified to include neuroscience in their arguments, and Dennett does little to address any of this; in many cases he dismisses it out of hand. He's far better than Dawkins, sure. But again, arguing that philosophy is now incapable of speaking about matters of the mind because of neuroscience amounts to a lot of neurobabble on one side of the argument. There are quite interesting arguments on that side, but it honestly took some digging for me to find them. I've heard/read some pretty terrible arguments stemming from this kind of thinking.


> In the same way that medieval Thomistic natural theology got blown away by the first real evo bio and cosmology.


In what sense? Many of his arguments still hold a great deal of weight in metaphysical debates, as far as I've witnessed. Dismissing his metaphysical/ontological arguments because of an appeal to evolution and modern cosmology is to grossly misunderstand them.


----------



## NoamL (Jul 7, 2017)

LOL well now that we know you like Christian apologetics I think it's a little clearer why you've been arguing for an immaterial mind and saying scientists have "materialistic bias." 

Do you also credit Plantinga's EAAN? Or his theodicy? Just curious how far this rabbit hole extends...


----------



## dpasdernick (Jul 7, 2017)

This is the author I was referring to in my post way above:

Yuval Noah Harari. Sounds like an interesting read. He forecasts billions of useless people (I work with a lot of them already), and kids not knowing what to study in college because whatever they learn will be obsolete sooner rather than later. Maybe the only thing that will save mankind is if the power goes out... and since I'm a drummer by trade, I will rule the planet one paradiddle at a time. 


https://www.theguardian.com/technol...n-humans-life-useless-artificial-intelligence


----------



## NoamL (Jul 7, 2017)

Yeah but he's another "interdisciplinary surfer dude" guy. Like Kaku. Always be suspicious of interdisciplinary surfer dude guys. Few things are less rigorous than a fascinating idea...

His big idea in Sapiens is that wheat domesticated humans.


----------



## Rohann (Jul 7, 2017)

NoamL said:


> LOL well now that we know you like Christian apologetics I think it's a little clearer why you've been arguing for an immaterial mind and saying scientists have "materialistic bias."
> 
> Do you also credit Plantinga's EAAN? Or his theodicy? Just curious how far this rabbit hole extends...


Interesting assumption. I have yet to present my worldview here -- just as I admit the strength or interest of certain arguments on one side, so will I admit them on the other side.
Regardless of my personal beliefs and conclusions, I appreciate intellectually honest discussion, not prejudicial categorization and subtly pejorative language. I've witnessed enough scoffing at various elements of certain arguments (i.e. the cosmological argument, which is typically grossly straw-manned, often by people on either side of the debate [not to mention that this argument is quite obviously in no way a defense of Christianity]) _without any robust, interesting or remotely intelligent_ rebuttals to see this bias demonstrated rather clearly, again regardless of my own personal beliefs (and there are serious atheist thinkers who agree, i.e. Quentin Smith).

Re: Plantinga. I thought it was an interesting argument when I came across it, but I don't think it's _conclusive,_ at least insofar as I've understood it. I think there's obvious evidence that truth-detection is favourable to survival in many situations.


----------



## Kyle Preston (Jul 7, 2017)

So many ideas to chew on in here. On a lighter note, I'm happy to have found someone who dislikes Sam Harrisian arguments (are they even arguments?) as much as I. 

Saw him on Bill Maher recently – I can't believe people keep buying his books – his rationalizations are absurd.


----------



## Rohann (Jul 7, 2017)

Kyle Preston said:


> So many ideas to chew on in here. On a lighter note, I'm happy to have found someone who dislikes Sam Harrisian arguments (are they even arguments?) as much as I.
> 
> Saw him on Bill Maher recently – I can't believe people keep buying his books – his rationalizations are absurd.


The popular thinkers tend not to be the truly interesting or robust ones. Again, Dawkins is a glaring example in that regard.


----------



## AllanH (Jul 7, 2017)

NoamL said:


> Interesting discussion. I can speak to it with *a little* bit of experience because I developed my own music AI, called Hyperion...



Thank you @NoamL for taking the time to write this out. I've never had the time to try deep learning on music, so it was fun to see a summary of your experiences. 

The modern DL algorithms are "frighteningly good" at what they do, so there is no doubt in my mind that something useful will come of projects like this at some point relatively soon.


----------



## Rohann (Jul 7, 2017)

AllanH said:


> Thank you @NoamL for taking the time to write this out. I've never had the time to try deep learning on music, so it was fun to see a summary of your experiences.
> 
> The modern DL algorithms are "frighteningly good" at what they do, so there is no doubt in my mind that something useful will come of projects like this at some point relatively soon.


I appreciate this as well! Interesting to see a practical example.


----------



## dpasdernick (Jul 7, 2017)

Eric G said:


> You called it. Go to the bottom of https://www.ampermusic.com/
> 
> There is a Premiere plugin that does just that, called the Adobe® Premiere Pro® Amper Panel.
> 
> Not saying that Amper is good enough at all, but there is $5M in funding, which means they are going to be around for a while trying to make it happen.




The guy in this video bugs me. He cheapens and diminishes the craft by telling me he's enabling millions of people to express themselves without spending money on equipment and years to learn an instrument. Why not write an algorithm that edits the video as well and do away with the human experience altogether? Heck, why even make new music or videos? Between the Beatles and John Williams every decent melody has been written. 

Imagine a world in which everyone can press a key on a laptop and the ET theme pops out. Everyone is special so now no one is special. Will AI make us all equal? Will Hans Zimmer become obsolete because I bought the Hans Zimmer plug-in for Cubase 16?

Perhaps the power will go out and the guy who knows how to pluck a chicken will be the new rock star. Interesting times gents. Interesting times.


----------



## Rohann (Jul 7, 2017)

dpasdernick said:


> The guy in this video bugs me. He cheapens and diminishes the craft by telling me he's enabling millions of people to express themselves without spending money on equipment and years to learn an instrument. Why not write an algorithm that edits the video as well and do away with the human experience altogether? Heck, why even make new music or videos? Between the Beatles and John Williams every decent melody has been written.
> 
> Imagine a world in which everyone can press a key on a laptop and the ET theme pops out. Everyone is special so now no one is special. Will AI make us all equal? Will Hans Zimmer become obsolete because I bought the Hans Zimmer plug-in for Cubase 16?
> 
> Perhaps the power will go out and the guy who knows how to pluck a chicken will be the new rock star. Interesting times gents. Interesting times.


I appreciate dreaming and all, but that video translates roughly to: "Amper helps people with no musical training or creativity feel creative by allowing AI to be creative for them, thereby diminishing any reward whatsoever one gets through creative expression, since having something else creatively express something for you is oxymoronic."
Collaboration? Sure. But some of these ideas are not only logically contradictory; they make lofty promises (unique music at the touch of a button) that can't be delivered (yet anyway -- that remains to be seen).

It reminds me of promises in videogames for amazing procedurally generated worlds to explore (infinite ones!), only to present this in actuality:


----------



## novaburst (Jul 8, 2017)

Well, one thing is for sure when these types of threads are posted: you can really see the book writers coming out. Some interesting novels here.


----------



## Nokatus (Jul 8, 2017)

Fer said:


> But who knows… perhaps I will change my opinion if someday I see an AI system designing another AI system which outputs correct results through mysterious procedures that “he” can't understand; and that “he” is doing that just for fun, or just because he has the desire to reach the level of his creator, the human being



It is exactly these kinds of events that, if/when they happen, will demonstrate these matters in practice, regardless of the years of debate on the supposed properties of the mind and how it perhaps can or can't be artificially attained. You don't even have to wait for a system that has markedly human characteristics in all the regards you imply. Instead, it's the moment if/when there's actually a system that is conscious enough to describe its own consciousness, the way it understands its own being, and tries to come to grips with its own awareness, demonstrating the use and understanding of abstract concepts, learning new ones and so on. After that, if you call for an "unknown element" in a debate of the mind, and maintain that something intangible (in the sense of "not successfully arising from what can be physically constructed and produced in a model running on/as that construct") is needed in order to have actual conscious activity -- or, more precisely, conscious activity with high-level abstract concepts -- you will need extremely convincing proof, instead of intellectual games, before dismissing that consciousness of a fellow being.


----------



## Fer (Jul 8, 2017)

I never expected this kind of conversation on VI-Control, but I'm enjoying it. At the risk of being slightly academic, I will try to respond... Please stop me if this is too much and I will shut up.



AdamAlake said:


> What proof do I have that you are a truly conscious being? None.


Because I always mark the “I'm not a robot” option… 

Well, I would say that to me consciousness is this thing of noticing that you are a being noticing yourself, along with a big amount of things surrounding you. It is reasonable to think that stones are not noticing anything similar, so stones are not conscious. That is the kind of useful stuff that I learned when I studied philosophy at the uni : )

In my opinion I don't need to prove my consciousness to myself because it is simply immediate evidence for me: even an empirical one, if you want. Proofs are only needed when statements are not evident. Descartes considered it in fact the only solid, evident truth that could not be rejected by him, even when he put himself in the mood of the most sceptical attitude: _I think, thus I am_… (a being who thinks).



Nokatus said:


> It is exactly these kinds of events that, if/when they happen, will demonstrate these matters in practice, regardless of the years of debate on the supposed properties of the mind and how it perhaps can or can't be artificially attained. You don't even have to wait for a system that has markedly human characteristics in all the regards you imply. Instead, it's the moment if/when there's actually a system that is conscious enough to describe its own consciousness, the way it understands its own being, and tries to come to grips with its own awareness, demonstrating the use and understanding of abstract concepts, learning new ones and so on. After that, if you call for an "unknown element" in a debate of the mind, and maintain that something intangible (in the sense of "not successfully arising from what can be physically constructed and produced in a model running on/as that construct") is needed in order to have actual conscious activity -- or, more precisely, conscious activity with high-level abstract concepts -- you will need extremely convincing proof, instead of intellectual games, before dismissing that consciousness of a fellow being.



Now... Alan Turing designed his test in 1950: when a real person cannot discover whether the written responses he is getting to his questions are coming from an AI machine or from a real human (Blade Runner stuff), we should conclude that the AI machine that in fact gave those answers *IS* truly intelligent. But why? Because the only way we conclude intelligence in others comes just from the observation of their external behaviour. So, according to Turing, showing externally intelligent behaviour is equivalent to being intelligent.

And that is an absolutely bold mistake to me. I would say that our concept of intelligence is extracted in the first place from conscious self-observation of our own reasoning. Who can disagree with that? You discover yourself thinking and reasoning and being, and you extract an intuitive concept of intelligence just from being conscious of your reasoning. You suddenly are amazed to discover that you are more than a stone... and amazed to be able to say to yourself, "I'm aware of myself!" Of course, all our reasoning is projected into external behaviours that we also call intelligent, like for example hitting a rock with another rock, or playing piano. Aristotle says that there is an animated motion that comes from the thing itself and is different from the movement of a rock falling down the hill. And, btw, animated... means "with anima" (and anima in Latin means *soul*, which is the word traditionally used for that conscious entity that can develop intelligent behaviours). Anima is that thing that animals have. In our case there is something behind our intelligent external behaviours: the source of all of them. And btw, I don't see any reason to avoid calling that source, which is discovering itself conscious in an evident way, a soul.

Now the scary part is that today we have machines showing unexpected "intelligent behaviours" in a way that the most creative science fiction writers could never have imagined. But this is, in the end, just no more and no less than "external behaviour". Now, to deduce true consciousness and true intelligence from intelligent behaviours, as Turing claims, is in fact a logical fallacy. Our true intelligence is projected onto animated intelligent behaviours. But you cannot deduce from that that intelligent behaviours imply true intelligence and consciousness in the background. You know... it is not true that if A implies B then B implies A. Calculators, chess algorithms, and neural networks are showing us surprisingly "intelligent behaviours" (although I think they should not be called that, because I prefer to reserve the word intelligence only for true intelligence).

So I think I'm not the one who should be giving solid proofs. The real proofs are needed when anyone claims that there is, or will be, consciousness and intelligence identical to ours inside designed machines.


----------



## Fer (Jul 8, 2017)

By the way, I think it would be nice if this thread were moved to the off-topic section.


----------



## Parsifal666 (Jul 8, 2017)

Off topic, but for plenty of OCD fun (and a fascinating take on consciousness), read Hegel's "Phenomenology of Mind" (some editions use the translation "Spirit").

The book lays out a whole, impressively made system pertaining to consciousness (mind the hilariously verbose writing style, of course). It can be really interesting to apply the thinking models he came up with on a wider scale, but be forewarned:

Hegel was a really interesting philosopher (and more of an influence on Schopenhauer and Nietzsche than they'd ever admit), but also the most jargon-riddled (granted, most of the aforesaid jargon consisted of his own original obfuscations). One of the hardest writers to actually read, but there are many rewards in his works.


----------



## Nokatus (Jul 8, 2017)

Fer said:


> So I think I'm not the one who should be giving solid proofs. The real proofs are needed when anyone claims that there is, or will be, consciousness and intelligence identical to ours inside designed machines.



I'm not quite talking about a human having to give solid proof of a machine being capable of having consciousness. I'm talking about the practical situation that may well arise, when the one doing the talking on its own behalf might be an artificial construct, something/someone that has the capability of being conscious and also thinking about and describing its own consciousness, pointing out what it's like being the entity that it consciously is. If/when we are at that point, at some future moment, debating with such a construct directly on whether it is truly conscious or not might be interesting. But if one wants to lead others to disregard its experience of the world and perception of itself, and describe it as somehow inferior, "less true" than those experiences you are having yourself, maybe against its own protests to the contrary even... That will need some hard evidence indeed.


----------



## DrJazz9781 (Jul 8, 2017)

Considering the drivel that drives most commercial music, AI will fill the role quite well. Jacob Collier, however, is another story.


----------



## Rohann (Jul 8, 2017)

Nokatus said:


> I'm not quite talking about a human having to give solid proof of a machine being capable of having consciousness. I'm talking about the practical situation that may well arise, when the one doing the talking on its own behalf might be an artificial construct, something/someone that has the capability of being conscious and also thinking about and describing its own consciousness, pointing out what it's like being the entity that it consciously is. If/when we are at that point, at some future moment, debating with such a construct directly on whether it is truly conscious or not might be interesting. But if one wants to lead others to disregard its experience of the world and perception of itself, and describe it as somehow inferior, "less true" than those experiences you are having yourself, maybe against its own protests to the contrary even... That will need some hard evidence indeed.


Like it or not, this will come down to metaphysical beliefs about morality and the like. Much of the language you're using is loaded ("understanding", for instance), and the burden of proof is indeed on AI development to demonstrate that an AI can even have an "experience" remotely comparable to that term as applied to a human.
As a minor example, I've seen AI programmed to express emotions, including negative ones, as responses to input. Obviously this is not a sophisticated AI, but my point is that simply because it can "express" those things in that way does not in any way imply hard, experiential consciousness. It's hard not to read your argument as "But when it does happen, how are we going to treat it?" implying the glaring holes in neuroscience regarding the experience of consciousness and the logical problems with the concept ("intellectual games" as I believe you put it) are somehow irrelevant. Disregard them all you like, they're still there.

Now, practically speaking, I think the difference between truly conscious AI and "soft" AI will be rather small, at least in terms of output. I think self-teaching AI is entirely possible without needing to experience anything in a qualitative manner (obviously), and I dread the tech obsession that considers development some sort of moral imperative and only considers possible consequences in hindsight.

**Edit: I think unless anyone is willing (never mind interested) to delve more deeply into the arguments and evidence on either side, this element of the debate is becoming tired and redundant, so I'll be bowing out, as I don't feel I'm adding anything productive at this point (unless anyone would like references for other perspectives on this topic, which, believe it or not, is separate from religion, as much as some might insist otherwise). I hope none of it became personal, and I appreciate the input given from all perspectives -- there are a lot of interesting implications to consider on either side of the debate.**


----------



## AdamAlake (Jul 8, 2017)

Fer said:


> I never expected this kind of conversation on vi-control, but I'm enjoying it. At the risk of being slightly academic I will try to respond... Please stop me if this is too much and I will shut up.
> 
> 
> Because I always mark the "I'm not a robot" option…
> ...



You miss the point that you will never be able to prove this to anyone but yourself. For all we know, you are just a mindless automaton processing external stimuli.


----------



## Nokatus (Jul 8, 2017)

Rohann said:


> Obviously this is not a sophisticated AI, but my point is that simply because it can "express" those things in that way does not in any way imply hard, experiential consciousness.



This is, indeed, very basic.



Rohann said:


> It's hard not to read your argument as "But when it does happen, how are we going to treat it?" implying the glaring holes in neuroscience regarding the experience of consciousness and the logical problems with the concept ("intellectual games" as I believe you put it) are somehow irrelevant. Disregard them all you like, they're still there.



It's not an argument I'm making at all, just stating (the obvious) that the question will perhaps very realistically become a practical matter in the future. It is indeed a question of how we are going to treat it; no need to resort to rhetoric like "it's hard not to read your argument..." and such. These discussions are very familiar from debates about the treatment of animals; there's nothing particularly new there, just the intellectual speculations and indeed trickery on the matter by Feser, for example, which would have little weight in that practical situation.

Just as it would have minimal weight if you encountered a new lifeform/culture from somewhere else entirely : ), perhaps based on a very alien structure compared to our own, and we started interacting with it and came to the conclusion that it's a sentient being with its own motives and intrigues, pleasures and creativity, and a desire perhaps to co-operate, learn and so on. If someone said, "you'll need to prove they really are conscious, otherwise we'll treat them as not worthy of equal ethical consideration", and so forth, that would seem like a hard position to be in. And it might also be that, at the same time, someone in their culture was expressing the exact same doubts about us.

So, if we in the AI field end up in a situation where we, on the other hand, have our biology and the processes in our brain and nervous system et cetera, and then we have a different kind of construct that also says it's conscious and defends itself, and in both cases we can't _really_ say what exactly produces the conscious experience in either one, and whether the other party alien to us is just feigning true self awareness... It's a stalemate of sorts, and if the situation ever comes up, a rather humorous one.

I'm also left wondering why you are replying to me directly after explicitly agreeing, in good spirits, that we wouldn't be pursuing the matter. I'm replying to this message in order to point that out, but your "I'll be bowing out" comment reflects my own plans as well. Again, in the end, no matter what debates like this contain, the actual applications and implications of consciousness and AI research will keep advancing, and history will eventually show what happens in practice.



Rohann said:


> Much of the language you're using is loaded



Likewise.


----------



## Fer (Jul 8, 2017)

AdamAlake said:


> You miss the point that you will never be able to prove this to anyone but yourself. For all we know, you are just a mindless automaton processing external stimuli.



Thanks for your nice words! I'm feeling some hostility here. Anyway, I would like to ask you two questions. First: do you think you are alone in the universe and everything else is machinery? Just curious. The second would be whether you, in the first person, have a different kind of experience from the ones I have been trying to describe (with more or less success). I'm sure I will never see your soul, but I don't believe that you are a machine. Honest responses are much appreciated.


----------



## Desire Inspires (Jul 8, 2017)

Polkasound said:


> It will affect the music industry in that it will allow less skilled composers to create acceptable music with the same lack of skills. It will create a surge of cheap composers, driving down the cost of music production for B movies, some commercials, and some video games.


 
Thank goodness!


----------



## AdamAlake (Jul 8, 2017)

Fer said:


> Thanks for your nice words! I'm feeling some hostility here. Anyway, I would like to ask you two questions. First: do you think you are alone in the universe and everything else is machinery? Just curious. The second would be whether you, in the first person, have a different kind of experience from the ones I have been trying to describe (with more or less success). I'm sure I will never see your soul, but I don't believe that you are a machine. Honest responses are much appreciated.



There is no hostility here, nor is it about me or you personally. It is about the fact that neither you nor anyone else can prove being truly conscious to each other. This is philosophy 101.


----------



## Rohann (Jul 8, 2017)

Nokatus said:


> It's not an argument I'm making at all, just stating (the obvious) that the question will perhaps very realistically become a practical matter in the future.


This is an argument, but I digress.



> It is indeed a question of how we are going to treat it; no need to resort to rhetoric like "it's hard not to read your argument..." and such. These discussions are very familiar from debates about the treatment of animals; there's nothing particularly new there, just the intellectual speculations and indeed trickery on the matter by Feser, for example, which would have little weight in that practical situation.
> 
> Just as it would have minimal weight if you encountered a new lifeform/culture from somewhere else entirely : ), perhaps based on a very alien structure compared to our own, and we started interacting with it and came to the conclusion that it's a sentient being with its own motives and intrigues, pleasures and creativity, and a desire perhaps to co-operate, learn and so on. If someone said, "you'll need to prove they really are conscious, otherwise we'll treat them as not worthy of equal ethical consideration", and so forth, that would seem like a hard position to be in. And it might also be that, at the same time, someone in their culture was expressing the exact same doubts about us.
> 
> ...


I won't go into addressing your points or defending myself here, as it's not going anywhere. I originally commented because I quickly grow tired of circularity in points of view and dismissive caricatures of opposing views rather than honest engagement with them (such as calling Feser's work "intellectual trickery and speculation").

However, I added the addendum at the end because I'm foolish enough to keep biting, and at this point I'm not adding anything meaningful to the discussion or helping avoid circularity. I kept the original post out of a desire to be candid rather than pretend I didn't write it. I trust, at this point, that people still interested will read fairly into the available evidence and arguments on both sides rather than accept rhetoric from either side. I think Feser's "Philosophy of Mind: A Beginner's Guide" is a well-reasoned and readable introduction (his blog occasionally expands on some perspectives and arguments), as are works by Nagel and Searle (even though I by no means agree with either of them on everything, I think they're actually readable and provide many excellent points of consideration). I'm sure others can make their own recommendations (and may have already).

In conclusion, I appreciate the discussion and perspectives and genuinely do wish yourself and everyone else well. I'm supposed to be here for learning about music, after all.


----------



## Nokatus (Jul 8, 2017)

Rohann said:


> I quickly grow tired of circularity in points of view and dismissive caricatures of opposing views rather than honest engagement with them (such as calling Feser's work "intellectual trickery and speculation").



Going through material like that and addressing it in detail here would be futile. I know it's dogmatic at its core, yet it is still very elaborately constructed and work-intensive to unravel in a satisfactory manner (hence "trickery"), since there are, frankly, no answers to the questions/problems he underlines and then tries to answer himself that conclusively satisfy both parties. That is something for another time and place, and preferably for someone else who hopefully gets paid to do it. I just wish, like you, that anyone can look past whatever rhetoric I or you or Feser or anyone else comes up with, and check out for themselves what is going on: who this Feser fellow or any other author is, and what their main ideas and supported ideologies on this and other matters are. Make no mistake, Feser's tone, for example, is often dismissive and caricaturing as well, just from the other side of the ideological divide.

Again, all of this babble doesn't matter in the face of what happens in practice. Wishing you well again, sincerely, and may you make awesome music in your days as well. Take care.


----------



## Fer (Jul 8, 2017)

dpasdernick said:


> The guy in this video bugs me. He cheapens and diminishes the craft by telling me he's enabling millions of people to express themselves without spending money on equipment and years to learn an instrument. Why not write an algorithm that edits the video as well and do away with the human experience altogether? Heck, why even make new music or videos? Between the Beatles and John Williams every decent melody has been written.
> 
> Imagine a world in which everyone can press a key on a laptop and the ET theme pops out. Everyone is special so now no one is special. Will AI make us all equal?


Surely those AI music programs will find a place in the room, because they will be generating easy money. But hey: what is the difference between clicking "RENDER" in AmperMusic and clicking "play" in Winamp? That by clicking "render" you created something? LOL
The more automated everything becomes, the more the world will come to resemble this...

and everything will be very boring.


----------



## Fer (Jul 8, 2017)

Rohann said:


> The "scientific community" (in quotes because there are obviously scientists on the other side) is also routinely guilty of materialistic bias that comes from speaking far out of turn (i.e. dismissing things their field doesn't deal with out of hand despite not having any authority to do so, ala scientism, which is a logically indefensible viewpoint).



@Rohann Materialistic bias. You said it. They are making the assumption that science is only science when it explains everything consistently through a materialistic model. And by the way, that is NOT a scientific assumption: it is a very philosophical one. What an irony. So if someone talks about teleology or intelligent design, this mainstream scientific community laughs out loud and marks him as unscientific by definition, without facing the challenge of responding to the arguments. After that they say, of course, that science is the opposite of myth and dogma. I would like to know the truth, not to build a consistent materialistic model.


----------

