

That breaks down into two categories - acoustic and amplified. VSTs and the gear that emulate the performance logic and physics of a guitar can get close to an acceptable reproduction of acoustic instruments, but that last mile will be a hard gap to close. That's because the resonant bodies of most instruments - especially stringed instruments - are shaped differently than speakers. The materials, the inertial matrix - they're just not the same.
If no human can or will produce such detailed documentation of existing performances, computational machines can - if not now, then soon - unless the inexorable march toward AI that can pass the Turing test is more exorable than it might appear. Won't a well-trained bot eventually turn out some licks that fall within the scope of enjoyable human performance habits? Sans keyboard, sans human hand - if not by bedroom producers with too much time on their hands, then by AI analysis and pattern-matching logic produced by overfunded grad students spending their Silicon Valley parents' fortunes, their guitar-licks algorithms trained on performance records and accelerometer measurements of real virtuoso guitarists? Is it not within the scope of eventual computational science to notate the performance aspect of playing the guitar? Presuming it is at least theoretically possible to digitally document in computational language every nuance of Eddie Van Halen's performance, the other aspect of the best-most-real equation is the rendering side: the guitar itself - from the pick or fingers on the strings, to the resonance of the wood body, the dynamics of the pickups, the amps, the effects and other such processing gear. When it comes to processing gear and amps, Simpsons already did it: throw down enough money and we can already get reasonable computational emulations of most popular stage electronics from the past century. The more difficult nut to crack in emulating the full drive train of a modern guitar is the instrument itself.

You are wrong, and evidently don't have enough experience with guitar and keyboards to understand why. What it comes down to is chord voicings and the physical capabilities of the human hand. With VSTs, real-time performance capacity is not an essential element of producing amplified music. Whether VSTs sound "like" a guitar becomes a computational question, a technology question, an economic question and a question of human perception. Let's isolate the guitar player from the instrument. The player can manipulate only three parameters - the tone, velocity and duration of the sounds to be generated. Few MIDI artists can document the finest details of legato expressed by some human performers, but such nuance is within the scope of current notational languages. Am I missing something? (Okay, the guitar player can dance and wear a costume, but for our purposes, that's not part of the equation.)
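Those three per-note parameters map directly onto a MIDI-style note event. A minimal Python sketch (the class and function names here are illustrative, not from any particular library):

```python
# Sketch: the three parameters a player controls per note - tone (pitch),
# velocity, and duration - expressed as a minimal MIDI-style note event.
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int       # MIDI note number, 0-127 ("tone")
    velocity: int    # strike velocity, 0-127
    duration: float  # seconds between note-on and note-off

def to_midi_events(note, start=0.0):
    """Expand a Note into (time, status, pitch, velocity) tuples:
    a note-on (status 0x90) followed by a note-off (status 0x80)."""
    return [
        (start, 0x90, note.pitch, note.velocity),
        (start + note.duration, 0x80, note.pitch, 0),
    ]

middle_c = Note(pitch=60, velocity=96, duration=0.5)
print(to_midi_events(middle_c))
# → [(0.0, 144, 60, 96), (0.5, 128, 60, 0)]
```

Everything the keyboardist plays reduces to a stream of these events, which is why the question becomes one of notation rather than physical performance.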

Agree with you that a VSTi for guitar isn't very realistic, but along those lines, I have built some sample libraries where I sample each string on every fret position. Each string becomes its own program; load the six string programs into KONTAKT and, lo and behold, you can recreate any chord voicing possible on a guitar fretboard. Now to your other point: you can't do this on a traditional keyboard layout (for example, within the first 12 frets you can have the same note on three strings; the fingering of the fretboard determines which string in which position is playing the note). In order to trigger these notes, a MIDI guitar controller is needed. A lot of work just to recreate what you can do on a real guitar. The only advantage to this technique is the ability to take a MIDI track, created this way, and substitute different guitar models to audition what might sound best. The MIDI guitar track can also serve as an educational tool showing how a part is performed. I have my reasons to create guitar tracks using MIDI, but I don't know why someone without a specific objective would go down this path. My last observation is related to amp sims and audio interfaces. Does anybody else notice that when you plug a guitar straight into your audio device (no mic'ing), the attack of the string is exaggerated compared to mic'ing the guitar through the amp? This out-of-balance attack portion is very noticeable to me; in some cases it sounds great, but in other cases it sounds bad.
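The fretboard point above - the same pitch being playable on several strings - is easy to sketch in Python. This assumes standard tuning and a 12-fret limit as in the post; the function name is illustrative:

```python
# Sketch: enumerate where a given MIDI pitch can be fretted on a guitar
# in standard tuning, limited to the first 12 frets.

# Open-string MIDI pitches, low to high: E2 A2 D3 G3 B3 E4.
OPEN_STRINGS = [40, 45, 50, 55, 59, 64]
MAX_FRET = 12

def fret_positions(midi_note):
    """Return (string_number, fret) pairs that sound midi_note.

    Strings are numbered guitar-style: 6 = low E down to 1 = high E.
    """
    positions = []
    for i, open_pitch in enumerate(OPEN_STRINGS):
        fret = midi_note - open_pitch
        if 0 <= fret <= MAX_FRET:
            positions.append((6 - i, fret))
    return positions

# Open high E (E4, MIDI 64) is playable on three strings in this range:
print(fret_positions(64))
# → [(3, 9), (2, 5), (1, 0)]
```

A keyboard has exactly one key per pitch, so nothing in the note number alone says which of those three positions to trigger - hence the per-string programs and the MIDI guitar controller.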
