Comments
I thought that too last week until it became apparent that most of the new features don't need the M-series silicon. It even works on my ageing MacBook Pro that doesn't officially support the required version of macOS.
Regarding the bass: all the bass on Dua Lipa's 2020 album Future Nostalgia is MIDI through a bass guitar VST, and most people didn't realise. It got listed on a lot of end-of-year "best basslines of the year" lists. So it can certainly get to the point where it's on the third-best-selling album of the year.
I don't really like the idea of an AI coming up with bass lines as to me I'm not composing the song, but I'm happy for there to be something that takes my MIDI and adds in all the microtiming and articulations that a real bass player might do.
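That "takes my MIDI and adds the microtiming" idea is basically humanisation. As a rough illustration (a toy sketch, not how any DAW actually does it; the note format and jitter values are my own assumptions):

```python
import random

def humanize(notes, timing_jitter=0.02, velocity_jitter=8, seed=None):
    """Loosen up rigidly quantised MIDI notes.

    notes: list of (start_in_beats, midi_pitch, velocity) tuples.
    timing_jitter: max +/- push/pull in beats (0.02 beats ~ 10 ms at 120 BPM).
    velocity_jitter: max +/- change to velocity, clamped to the MIDI 1-127 range.
    """
    rng = random.Random(seed)
    out = []
    for start, pitch, vel in notes:
        new_start = max(0.0, start + rng.uniform(-timing_jitter, timing_jitter))
        new_vel = min(127, max(1, vel + rng.randint(-velocity_jitter, velocity_jitter)))
        out.append((new_start, pitch, new_vel))
    return out
```

A real player-modelling feature would go much further (articulations, slides, string choice), but random timing and velocity drift alone already takes the machine-gun edge off a programmed part.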
Keys I'm slightly more OK with, and I already use an advanced arpeggiator to change block chords into a pattern a keyboard player might actually play.
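For anyone who hasn't used one, the block-chords-to-pattern trick an arpeggiator does is conceptually tiny. A minimal sketch (my own simplified model, assuming a chord is just a list of MIDI pitches):

```python
def arpeggiate(chord, pattern, steps):
    """Turn a held block chord into an arpeggiated line.

    chord:   MIDI pitches low to high, e.g. C major = [60, 64, 67].
    pattern: indices into the chord; [0, 1, 2, 1] gives an up-down shape.
    steps:   total number of notes to emit (pattern repeats as needed).
    """
    return [chord[pattern[i % len(pattern)]] for i in range(steps)]

# Two bars of 8th-note up-down arp over C major:
line = arpeggiate([60, 64, 67], [0, 1, 2, 1], 8)
```

Hardware and plugin arpeggiators layer gate length, octave range, and swing on top, but the core is just cycling a pattern over whatever notes are held.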
It is undoubtedly going to kill what remains of the session music scene, except for very high-level players with a specific sound, and people producing library music for film and TV are again going to be in trouble. Benn Jordan is an interesting YouTube guy who used to do this (he's also an IDM artist), and he got out of it a few years ago for similar reasons.
But either way the result is the critical thing, and most people just want “a part” rather than “the best part possible”, which of course is also why most music is bland rubbish. But either way it’ll hurt session folks for sure.
The difference is that the lines on that sound like a synth.
If you listen to Future Nostalgia it sounds like a bass player, it even pops notes etc.
It's more the fact that the AI will create basslines far better than I can (I'm no bassist), so why would I hire a bassist, or bother to write an inferior bass line, when I can have a better one immediately? Why would I employ a pianist to write accompaniment for my music if I can get a similar effect from the piano AI in my DAW? Especially if I'm part of the new breed of musicians who sit at home writing, do YouTube, and don't play live.
I agree, we will see fewer and fewer people able to make money from it. For me it's really convenient, but I also feel like I'm cheating. Drums are one thing, and making sequences for an arpeggiator is fine, but getting your DAW to write whole parts on different instruments seems like something else entirely.
Drummer is great but I often use it as a template and then make changes in the piano roll to my liking for more flexibility.
What does everyone else make of this new feature? (assuming you’re on silicon / able to use it)
Just saw the prices for a new iPad Pro: starting at a grand, with another 2-300 for the keyboard and pencil, and it's M4 now. Staggering.
There is an open source stem splitter, whose name escapes me, that can extract 6 elements and variations, which makes spending money on this kind of stuff unnecessary, unless you are restricting yourself to macOS. And at this point, why would you, when they are making their own hardware obsolete on such a quick cycle?
I was working on a sample-based track that had one section with no appropriate bass samples, so I wrote a simple bassline to match it and tried to get it sounding similar to the parts of the track that did have bass guitar samples.
I was very disappointed: it sounded fake to me, no better than the Korg M1 VST (a late-80s rompler, so hardly "accurate"), and that's after running it through Logic's Bass Amp simulator plugin.
It was also making quite dumb choices about how it played the notes, moving up and down the neck a lot while staying on the same string, so that side of it doesn't seem very effective either.
I ended up using a very obvious synth patch so it didn't even sound like I was trying to make it sound like a Bass Guitar.
Early indications are it's fast and very good at what it does. Not quite as good as Moises, not least because it can only do the standard 4 splits, but also doesn't split vocals quite as cleanly on the tracks I tried. (Also because it doesn't work on my phone!)
But it's a great start and it'll be absolutely plenty for a lot of people - I expect extracting drums or vocals from an existing track will be the biggest use cases.
This is just the first riff that came into my head over a drum break I'd sampled for an Elektronauts competition. The bass is my new Squier through a Nembrini Dark Glass emulator and the guitars are all through their free Krunck (I guess Krank?) model. Probably more aggressive sounding than I'd usually go for, but still it's pretty crazy that you can knock something like that up in 2 hours on a tablet.
There is no 'H' in Aych, you know that don't you? ~ Wife
Turns out there is an H in Haych! ~ Sporky
Bit of trading feedback here.
Thank you!
It's the first time I've recorded a guitar in 7 years so it felt good to be doing some recording!