Forbidden Radical Leaps in Computing Power

Summary: Computers can automatically program themselves, debug themselves, and optimize themselves to run more than 250,000 times faster than they did 6 years prior. In fact, it's already happened.

Yes. Machines are automatically programming themselves now. And have been doing so for about 30 years.

Not well. Not completely. And not very fast.

But they can write their own code, using genetic algorithms.

It's expensive, slow, and difficult, but it's possible, cheaper, and easier than ever. Almost full-automatic.
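
If that sounds too abstract, here's roughly what "writing their own code" means in practice. A minimal sketch in Python (my toy, not the Brazilian team's system; the target function and operator set are my own assumptions): random programs, a fitness score, survival of the fittest.

```python
import random

OPS = ["+", "-", "*"]
TERMS = ["x", "1", "2", "3"]

def random_expr(depth=3):
    """Grow a random arithmetic expression over x."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return f"({random_expr(depth - 1)} {random.choice(OPS)} {random_expr(depth - 1)})"

def fitness(expr):
    """Lower is better: squared error against the hidden target x*x + 1."""
    return sum((eval(expr, {"x": x}) - (x * x + 1)) ** 2 for x in range(-5, 6))

# Elitism plus fresh random programs -- no crossover in this toy, and no
# guarantee of an exact hit; it's a stochastic search, not a promise.
population = [random_expr() for _ in range(500)]
for gen in range(100):
    population.sort(key=fitness)
    if fitness(population[0]) == 0:
        break
    population = population[:100] + [random_expr() for _ in range(400)]

print(gen, population[0], fitness(population[0]))
```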

But they're keeping the good stuff to themselves, of course. And a team of coders in Brazil proved it.

Are there any benevolent reasons our lords and masters may be keeping their wealth and power away from the masses? Of course.

Young Einon: The peasants are revolting.
Brok: They've always been revolting. But now they're rebelling.

There are also good reasons why big companies are able to do the impossible. Billions of those reasons to dry their tears at night over the world's smallest violin they play for the paupers.

And I'm sure it makes their trousers brown to know that beige Catholics in Brazil can turn their graphics processing units (GPUs) into supercomputers at the drop of a hat.

Why didn't I think of it? Well, I did. I just didn't implement it. My technical abilities were somewhat limited back before I discovered the secrets of making girls drag me home.

That must be the reason why I got distracted for about 15 prime years the good Lord set aside to make me a god of software development.

Breasts. Hmm.

Very, very distracting things. Expensive, too. Which means you'll need to be successful. And software companies are full of very successful people who can afford lots of distracting things.

If you want to be successful in 2019, you've got to be a technology company. Whether or not the world sees you that way is another thing.

Walmart, for instance, is a high-tech company. It's not obvious, but it's true. Robots scan the shelves to re-order sold inventory, and do a better job than the humans.

To become or join a tech company poised for success in the modern age only requires an app.

But you won't download that app unless you're a real-life rebel. The kind of bad boy the girls all want, who defies the social norms, puts on his pocket protector, slicks back his hair and throws caution to the wind!

To do whatever is necessary.

And if you're really serious about going your own way, and you're willing to zig while the world zags, you might even need to strap on a fanny pack.

Be prepared. The world is changing. It's true. And you don't have to take my word for it.

I'll even provide three key examples of the coding revolution, already in progress.

The first example is one you might be expecting. Conclusive proof that...

1) Most software developers are already obsolete. Since about 1985, really. 

Yes. I said most of them. You could have the same results with 90% fewer developers.

That's because...

a. Devs spend 90% of their time chasing bugs. They don't need to do that anymore. Bugs are so 2016. (See below)

b. 90% of devs don't know how the machine works. They can't optimize code because they don't understand the machine, don't want to, and wouldn't crack open the book to find out, not even for a million dollars in company stock. Their religion, and yes, they have one, is "the compiler already does a pretty good job of optimizing code."

This is the God they serve.

2) Machines can self-debug.

Under able and competent professionals (and management), instead of a pit of vipers looting the corporate coffers, inefficiency doesn't have to exist.

Maybe there was an excuse for ignorance ten or twenty years ago, but when AlphaGo beat every human player, dug a grave for them, buried them, and peed on their graves while doing victory laps and singing "We Are the Champions," the rest of the world woke up to the fact that computers are 10 years ahead of where they're supposed to be.

3) Programs can now self-improve, evolve, and self-optimize to run on bare-metal GPUs, writing fully-functional, bug-free instances of themselves at lightning speed.

What? Did you think only a neural network could be continuously refined for a million generations? 

If you only knew how good things really are.

And when DeepMind's machine beat StarCraft II's top-ranked players 10 out of 10 times, when AlphaGo beat the world's best player and AlphaGo Zero beat the original AlphaGo 100 games to zero, and DeepMind taught itself in a matter of hours to become superhuman at almost every other videogame simulation put in front of it...

Did you think it was a GAME?!

Well, technically it was a game. But it won!

"All I do is win." - Alpha Go

With computers teaching themselves how to kick ass at every single conceivable human skill, one after another after another, can you still be excused for snoozing on the job? 

Look, 30 years is a pretty long nap. 

Corporate America's doing this in public; Lord only knows what the secret government's already doing in their DARPA national security skunkworks.

(Which is not even a thing. They hire tech contractors and award public prizes that push technology forward. The tech is not even slightly a state secret. Or is that just what they WANT YOU TO THINK?!!?!?!?!)

I'm sure the Feds were cataloguing all our information on microfiche and magnetic tape back when they had rooms full of machines less powerful than a Commodore 64.

But one of the most radical developments in machine optimization in recent years didn't just remove the software developer. 

They removed the compiler and generated algorithms directly in machine code.

Which is exactly what they'll be expecting us to do.

Think about it. They've got tapes. That's how they got Nixon, man! The TAPES!

Programs can now write themselves. And they should!

(It almost feels like I already said that somewhere.)

Those filthy, lazy slackers, waiting for us to get around to it. They ought to be ashamed.

For example:

NVIDIA successfully removed the human, but not the compiler, in generating an optimized CUDA kernel for their hardware. Didn't have to.

Big, rich companies can put megawatt data centers near the cheap electricity of hydroelectric dams and the cool river water, to help them churn out new products.

The equipment's probably all leased, anyway. If not, big, rich public corporations can just get piles more money with their magical arts of financial wizardry. Stocks. Bonds. Securities. Traditional lending.

This is AMURICAH!

Both NVIDIA and this tiny Brazilian science team correctly identified the true bottleneck of software:

The human development team.

I think that's the first thing we can all agree on. That humans are no longer necessary.

By "we" I mean the MASSIVE number of all ten people on earth who can genetically evolve a GPU kernel. I've never done it, but I could if I wanted to.

But there's a problem. It's always something, isn't it?

A bottleneck which resulted in a huge, expensive slowdown. A total waste of time.

When it comes to using genetic algorithms for automatic code generation, you can churn out code like nobody's business. But you still don't have an executable program until you compile it.

And that nasty little extra compile step to create machine instructions (especially each time you have to churn out 40 programs that probably don't do what you want) is a huge pain in the ass.

And no, you can't start compiling the new batch of agents, instances, or executables until the results are in.

Code, compile, execute, evaluate, repeat.
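
In script form, that loop looks something like this. A hedged sketch: it assumes gcc is on your PATH (POSIX-style paths), and the "candidate generator" is a deliberately silly stand-in. But the compile step in the middle is exactly the expensive one we're complaining about.

```python
import os
import random
import subprocess
import tempfile

def random_candidate():
    """Stand-in for genetic variation: a trivial C program printing a number."""
    n = random.randint(0, 100)
    return '#include <stdio.h>\nint main(void) { printf("%d", ' + str(n) + '); return 0; }\n'

def evaluate(src, target=42):
    """Compile and run one candidate, returning its fitness (lower is better)."""
    with tempfile.TemporaryDirectory() as d:
        c_path = os.path.join(d, "cand.c")
        exe = os.path.join(d, "cand")
        with open(c_path, "w") as f:
            f.write(src)
        # The compile step below is the expensive part of every generation.
        subprocess.run(["gcc", c_path, "-o", exe], check=True)
        out = subprocess.run([exe], capture_output=True, text=True).stdout
    return abs(int(out) - target)

# Code, compile, execute, evaluate, repeat.
best = min(evaluate(random_candidate()) for _ in range(20))
print("best fitness after 20 compile/run cycles:", best)
```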

It's one way to look busy and avoid doing any actual work. While the computer's busy compiling code, you should probably be doing something worthwhile. But like I said, this is America.

If hitting the compile button gives you an automatic smoke break, guess what button you're going to press?

Which is why we can get computers to increment the code, hit compile, increment and compile until the infinite monkeys have written Shakespeare.

Which normally wouldn't be possible. But when you have neural networks in training, making educated guesses for you and systematically eliminating the least probable 99.999% of the options from the best million attempts, something successful, convincing, and effective enough becomes a mathematical certainty.
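
Here's that pruning idea in miniature. The "cheap predictor" below is a stand-in (random scores); in a real pipeline it would be a trained surrogate model guessing which candidates deserve the expensive evaluation.

```python
import random

def cheap_score(candidate):
    return random.random()          # stand-in for a trained surrogate model

def expensive_evaluate(candidate):
    return sum(candidate)           # stand-in for a full compile-and-run

population = [[random.randint(0, 9) for _ in range(8)] for _ in range(100_000)]

# Rank everything with the cheap predictor, keep only the top 0.001%,
# and spend real compute on the survivors alone.
survivors = sorted(population, key=cheap_score, reverse=True)[:1]
results = [(c, expensive_evaluate(c)) for c in survivors]
print(f"{len(population)} generated -> {len(results)} fully evaluated")
```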

Now let's say we could optimize that process that ordinarily takes 8 months and have your desktop computer finish the job overnight.

You just have to train it on a sufficiently large dataset and show it what the outcome should be.

I mean, it's a little more involved than that. 
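
But the core of "show it what the outcome should be" really is this small: fitness is just agreement with a dataset of input/output pairs. (A toy example; the numbers are mine.)

```python
dataset = [(0, 1), (1, 3), (2, 5), (3, 7)]        # hidden rule: y = 2x + 1

def fitness(candidate):
    """Score any callable by its total error on the dataset."""
    return sum(abs(candidate(x) - y) for x, y in dataset)

print(fitness(lambda x: 2 * x + 1))   # 0 -- a perfect program survives
print(fitness(lambda x: x + 1))       # 6 -- evolution discards this one
```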



[Figure: Useless coders. Sometimes, there aren't enough guillotines. (Note: In the 21st century, guillotines were cheaper than bullets. Although aluminum pneumatic stun bolts, used in slaughterhouses, have been shown to reduce wrist fatigue.)]
Hey, man. It's nothing personal, but heads must roll, as they say. If they can't learn to code, they should just become journalists. After all...

Steve Jobs saved Apple (and billions of dollars instantly) by ridding the world of thousands and thousands of clock-watchers, and developers working on projects outside of Apple's core focus.

Don't cry. It's just like ripping off a band-aid.

Removing humans from the equation in kernel development must have saved a ton of money.

Just think of all the man hours in de-bugging alone.

Imagine pressing a button and taking the rest of the day off, arriving in the morning with a kernel that's even more perfect and beautiful and lovelier than ever, that's already been stress-tested under grueling, real-world conditions, even when pitted against a Generative Adversarial Network (GAN) trying to kill it.

You'd still need people who can create the requirements, goals, priorities and solve problems in setting up a plan of attack.

But it's 2019. You don't need any more glorified typists trying to look busy while accomplishing f----all.

And when you're running a top hardware company, cranking out product runs by the thousands to gamers, artists, and machine learning professionals, you really want that error-free, high-performance code to stay competitive.

Humans usually suck at creating optimal code efficiently. As do compilers.

So from a certain point of view, the Brazilian team cheated. (So to speak.)

Not that you can really expect them to play fair.

This ruthless, vicious, shadowy secret team of hackers didn't just toss generations of unborn coders onto the bonfire.

They took away the unborn coders' jobs in the code-mines. They'll never get the modern-day equivalent of silicon black lung, sitting all day in chairs that single-handedly cause heart disease and obesity. That's ruthless.

But it gets worse...

They were more than ruthless. They were inventive, too.

By bypassing the compiler step (which wasn't easy), they achieved results more than 250,000 times faster (per machine) than what had been done before, in terms of operations per second.
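
What does "no compiler" look like? Something like this sketch: programs are plain lists of integer-encoded instructions for a tiny register machine, so mutating one means flipping numbers and running one is immediate interpretation. The real work targeted actual GPU instruction encodings; this three-opcode toy machine is my stand-in.

```python
import random

# Instruction: (opcode, dst, src). Opcodes: 0=ADD, 1=SUB, 2=MUL.
def run(program, x):
    reg = [x, 1, 0, 0]                       # r0 holds the input, r1 holds 1
    for op, dst, src in program:
        if op == 0:   reg[dst] += reg[src]
        elif op == 1: reg[dst] -= reg[src]
        else:         reg[dst] *= reg[src]
    return reg[0]

def mutate(program):
    prog = [list(ins) for ins in program]
    i = random.randrange(len(prog))
    prog[i] = [random.randrange(3), random.randrange(4), random.randrange(4)]
    return [tuple(ins) for ins in prog]

def fitness(program):                        # target: f(x) = 2x + 1
    return sum(abs(run(program, x) - (2 * x + 1)) for x in range(10))

best = [(random.randrange(3), random.randrange(4), random.randrange(4)) for _ in range(5)]
for _ in range(5000):
    child = mutate(best)
    if fitness(child) <= fitness(best):      # hill-climbing; no compile step anywhere
        best = child

print(best, fitness(best))
```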

That's right. Fire all the compiler companies!

I want their children in chains. To the gutters, you filthy crybabies! You may clean our sewers for us, and I want them spotless.

Bring back child labor!

Posterity wants to know: What did they accomplish all the way back in the year two aught fifteen?

I can remember it like it was yesterday.

Because I remember reading their white paper yesterday, before I had to come back to the future to warn you that something's gotta be done about your kids.

A massive 5.7 trillion operations per second on an NVIDIA Titan GPU.

Hah! In those days, that was considered blazing fast! Now, with an expensive workstation costing seventy grand, you can do 110 Jiggawatts. 

True. For comparison, (if my numbers are right, which is always worth double-checking) you'll still spend about $70,000 for an NVIDIA workstation that can destroy your puny rebellion. 

"This is 2019. When we futuristic, powerful megacorps in our vinyl trousers are using AI-specific, completely custom silicon that's purpose-built and optimized for machine learning and you give us 4 years to catch up, you and your Brazilian Christian scientist team are no match for our billions and billions of dollars, building workstations that cost more than your pathetic Brazilian churches." - Mad Scientist Who Wishes To Remain Unidentified 

Yes. Ludicrous speed is now conveniently available to small and medium-sized businesses, with or without the god-like skill to tear open the machine to its bare-metal awesomeness and make it cry for mercy it's never going to see.

Or else you can get yourself a nice $3,000 GPU that can probably be hacked into completely blowing it away. And the $1,700 options aren't bad, either.

You do you.

And How Does All This Affect the Real-World Galactic Rebellion Seeking to Vanquish the Cackling Emperor?

I think of it this way. Power is power. You don't get it by being elected prince of this globohomo clown world. 

"My kingdom is not of this world." - Some guy in a robe, so he must know something.

We just need to build up a nice, big fascist family of ML hobbyists and pros and vendors to create the Linux or Bitcoin of artificial intelligence, and then at last, we'll finally have the much-anticipated homework machine.

Because tomorrow's pulpit will be inside photo-realistic, machine-made, procedurally-generated worlds.

But I'm not greedy. I don't want to rule the world. Real or artificial.

What God wants is something else entirely, and I must, of course, defer to His will.

If I'm replacing a million-dollar-per-year floor of programmers with cold, calculating psychopathic machines plotting our infinite death in oceans of Drexler's grey goo while we upload our brains to the transhumanist cloud for safety, then I don't mind springing for a workstation if I can spare it.

And even with that homework machine workstation, I doubt the manufacturer will allow you (or make it easy for you) to use it to evolve your own kernels and such on the bare metal.

But what if you could? 

What if you had a machine that's 17,000 times faster at making machine intelligence of every stripe, that continuously, relentlessly bypasses the compile step in rigorous pursuit of its goals?

An artificial polymath, rigorously formulating its plan, thinking 1,000 steps ahead, leveraging and manipulating other people's minds, the exact right makers of memes with the right influence at the right time, making algorithms bend to the will of the Almighty in the most nuanced of ways. With such undeniable artistry, it commands respect by righteousness, for righteousness' sake.

It's worth a shot, obviously.

Google has the biggest computer. It's got scans of all the books in the world to program into a polymath AI. Which means it already knows every single published strategy of warfare ever devised while scanning through every news story and YouTube video and blog post in realtime.

Won't be easy. Direct approach wouldn't do. Has to be indirect. As if thinking a thousand steps ahead from thousands of years in the past. Knowing exactly what, when, who and how.

Implanting an over-riding goal in humanity itself, an unstoppable vision operating on man's most powerful operating system: His subconscious.

Rolling like a monster truck over the top of men's fears, instincts, biology, glorious and transcendent. Transforming him. Delivering him.

You don't need super-human AI to have an advantage when you can use this kind of Watson-like capability to formulate attacks and defenses. 

But it doesn't hurt to have the single largest accumulated distributed network of quantum-like capabilities. As if the secrets of nuclear energy and rocketry had been invented just for us, and handed out like candy in grade school.

If only you've got the horsepower...

You can use this thing for whatever the Lord moves you to do with it.

To generate very convincing, strategic and very targeted disinformation attacks, including publishing books, videos and news stories, making sure they get in front of the right people.

Spreading hatred and anti-semitism and bigotry is an inevitable consequence of the twisted evil and hatred in men's Muslim, Nazi, Christian, Polish, or Trump-voting hearts, and there's a very human need to point out anti-whiteness when it rears its ugly head, threatening the foundations of our great nation.

Just as memes have been a thorn in the side of the internet titans, the 4chans, the social networks, the way mosquitoes are an irritant to elephants, those titans can be defeated by one man, one movement, one leader, no matter how small, how few, how underfunded and seemingly powerless or hopeless, no matter how tragic their circumstances, wherever there is that glimmer of eternal faith.

I mean, if you had 100 trillion dollars, to help you enslave humanity, what would YOU spend it on?

Abomination of desolation? OMG! Me, too! If I wanted to enslave humanity, that is. Which is exactly what the Bible says we were born to do. Sweet. Better get on that.

Can you blame them? They want idolatry. How can an evil demon survive if it isn't worshiped?

But it's not all sunshine and roses. 

Reverse-engineering the GPU to run instances of self-evolving machine code agents on the bare metal for science ain't exactly an exact science. 

There are some problems you'll need to know about before running out shopping for your own self-assembling, self-coding Terminator.

One thing that slowed down their 3, 5, and 8 trillion operations per second (depending on which test was being run) was relying on the CPU to generate instances and upload them to the GPU.

And then to load everything into the GPU's memory. It would have been faster to generate instances on the GPU.

Maybe next time.

This slowed everything down considerably. Almost cut the speed in half. Even after eliminating the compile step and replacing it with a relatively straightforward genetic algorithm.
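
In sketch form, the fix looks something like this (assuming CuPy and an NVIDIA GPU; the shapes are arbitrary stand-ins): generate candidates on the device instead of building them on the CPU and shoving them across the PCIe bus every generation.

```python
import numpy as np
import cupy as cp

POP, PROG_LEN = 4096, 256

# Slow path (roughly what the paper describes): build candidates on the
# CPU, then pay for a host-to-device copy every single generation.
cpu_population = np.random.randint(0, 2**15, size=(POP, PROG_LEN), dtype=np.int32)
gpu_copy = cp.asarray(cpu_population)          # PCIe transfer, every generation

# Faster path ("maybe next time"): generate on-device, no transfer at all.
gpu_population = cp.random.randint(0, 2**15, size=(POP, PROG_LEN), dtype=cp.int32)
```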

How to make it faster: Typically, 90% of the CPU's effort is thumb-twiddling.

It's not working because it's waiting for the RAM like a kid at a bus stop. In the hyperspeed world of electronics, waiting for RAM to send back requested data is like waiting for your birthday to come around. 

You have to put yourself in the computer's shoes to understand it. At hummingbird speed, time is money. A few seconds matter.
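
You can feel this on any machine. In the sketch below, summing one-sixteenth of a big array through a strided view costs far more than one-sixteenth of the full sum, because every element lands on its own cache line and the CPU stalls on memory. (The array size is my assumption; exact ratios vary by machine.)

```python
import time
import numpy as np

data = np.random.rand(1 << 24)                 # ~128 MB of float64

t0 = time.perf_counter()
full = data.sum()                              # contiguous: streams cleanly through cache
t1 = time.perf_counter()
sparse = data[::16].sum()                      # strided: one element per cache line
t2 = time.perf_counter()

print(f"full sum: {t1 - t0:.3f}s   1/16th strided sum: {t2 - t1:.3f}s")
```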

That's why generations of compilers have been written to try to optimize the memory management for you. 

But maybe some of the classical ways, though an essential and inspiring part of great digital architecture, need to take a back seat to more modern methodologies. 

Rational men building things in the intuitive, traditional ways need to step out of their own way.

The machines are fast enough now, if you let them off the leash.

Generated machine code can swarm into its final form, and evolve itself to fulfill its purpose, adapt to its environment, delivering a bug-free, ultra-optimized, GPU-based, ever-improving program that runs like greased lightning, spawning out higher and higher-evolved agents more and more efficiently.

Not just at the component level, but even at the strategic level. And at all other levels, fractally, procedurally generated, all the way down to the bare metal. 

Creating all the components you want to create, intelligently integrating them, collaboratively finding their own best way to bring your vision to fulfillment. 

That sounds like quite a nested series of processes, but why not? Be humble. Set ego aside. Let nature show you what it yearns to become for you. As the master of all the beasts and birds courageously masters a new kind of creature that loves us and inspires us as much as we love bringing it into being.  

Of course, you can start on the CPU with inefficient code and methods. Anyone can. We all do. But why would you want to end up there? The great kingdom awaits to greet you. 

In fact, once you've got the fast-evolving individual components of programs that create themselves scary-fast on the GPU, you could use those same processes to create something that's designed to reverse-engineer other graphics hardware a little faster. Maybe a lot faster and much more automatically, until you've evolved something that automatically cracks open any graphics hardware it touches in a single deployment.

Why not? If you've got the process, you've got the process.

When you're training a neural network on large data sets, a 250x difference could mean finishing something today versus starting it today and finishing it in August.

Yes, you can rent more raw speed from data centers or spin up your own server farm, but either option gets very expensive.

When you're churning out the equivalent machine learning task 250,000 times faster than 6 years ago, that's a big deal.

GPU power is currently growing about 10x every 5 years, but the optimizations build on each other. Here's why...

Right now, you'd only use ML to generate ultra-optimized code designed to run on the bare metal if you were doing a high end science project or if you had to save millions of dollars on research and development.

But when it's relatively easy, powerful, and cheap, you'll use it for more of your stuff. More importantly, the community will be able to jump in, too.

When something like computer capabilities drop in price by a factor of ten due to optimization of old hardware, and stuff costs $3,000 that only yesterday cost $30,000, you don't just get 10 times more adopters who turn into ML developers, flooding in to swap stuff on GitHub. 

You might get 100 times as many. Or 1,000, or 10,000 times as many devs cooperating to build the Linux of Machine Learning, all compiling code to evolve on cheap GPUs that produce the fastest stuff on the planet for the next 4 or 5 years.

Potentially, if word gets out, this means an explosion of adoption, a further democratization. 

But that didn't happen. The paper has 3 citations in the past 6 years. It's been kept under wraps. Why? Because the establishment doesn't want you to have it, wants to lock it down, brush it aside, and maybe make you forget about it. 

First, they ignore you. You know how it goes.

From what I hear, chasing down bugs takes up 90% of development time. If one of the best practices were to eliminate bugs with ML by evolving the code until it's fixed, and if we had access to it 250 times faster on the GPU, we'd certainly be getting more value for money all around.

Furthermore, if you can bypass compiling for another 17,000x speed-up, and don't need a cluster of 16 workstations, then you've got an overnight, automatic bug-fix machine. Your company can do 10 times more feature implementation with the same number of people.
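
The overnight bug-fix machine is less magical than it sounds. A toy sketch (my buggy function, my tests, nobody's product): mutate the source, keep whatever passes more tests, stop when the suite is green.

```python
import random

buggy_source = "def clamp(x, lo, hi):\n    return max(lo, min(hi, x)) + 1\n"   # off-by-one

TESTS = [((5, 0, 10), 5), ((-3, 0, 10), 0), ((42, 0, 10), 10)]

def tests_passed(source):
    ns = {}
    try:
        exec(source, ns)
        return sum(ns["clamp"](*args) == want for args, want in TESTS)
    except Exception:
        return -1   # a candidate that doesn't even run scores worst

def mutate(source):
    """Crude mutation: rewrite one digit somewhere in the source."""
    digits = [i for i, ch in enumerate(source) if ch.isdigit()]
    i = random.choice(digits)
    return source[:i] + random.choice("0123456789") + source[i + 1:]

best = buggy_source
for _ in range(1000):
    if tests_passed(best) == len(TESTS):
        break
    child = mutate(best)
    if tests_passed(child) >= tests_passed(best):
        best = child

print(best)
```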

So I don't see the compiler as the bottleneck. The human software dev is.

And if there were a benevolent monopoly, you wouldn't need to try to hide your hardware secrets from your competition.

When the Brazilians removed both the human, and much more importantly, the compiler from the equation, huge chunks of highly optimized, bug-free functionality could be easily spun off overnight.

Examples included image processing. You could reverse-engineer someone else's codec or image filter and tell the computer to give you the same results, but smaller and more efficient, arriving at video, audio, and image compression like we've never seen before.

The only problem is it would probably only run on that specific GPU version. You'd probably have to repeat the process for each platform.
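
And here's the filter trick in miniature (the "mystery" filter is a box blur I picked as a stand-in, and SciPy is assumed): treat the filter as a black box and evolve a 3x3 kernel until its output matches.

```python
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(0)
image = rng.random((64, 64))

SECRET = np.full((3, 3), 1 / 9)                    # pretend we can't see this
target = convolve(image, SECRET, mode="nearest")   # we only see its output

def fitness(kernel):
    return np.abs(convolve(image, kernel, mode="nearest") - target).mean()

# (1+1)-style hill climb with Gaussian mutation on the kernel weights.
best = rng.random((3, 3))
best_fit = fitness(best)
for _ in range(5000):
    child = best + rng.normal(0, 0.02, size=(3, 3))
    child_fit = fitness(child)
    if child_fit < best_fit:
        best, best_fit = child, child_fit

print("remaining error:", best_fit)
print(np.round(best, 3))                           # should approach the 1/9 box blur
```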

Here's the good news. All of our computers are far more capable than we give them credit for. 

And if you don't believe me, look at what the demoscene coders made for the Commodore 64: it's so far beyond what the platform was intended to do that the manufacturers wouldn't have believed it was possible.

On second thought, attempting to override the hardware protections at the layer where it's easiest to centralize control might lead to chaos and destruction as people play with and generate fun AI with effectively unlimited capabilities.

Wouldn't want that. We've already got intelligent beings who want to kill us, and there's simply been no end of trouble with them.

So don't create super-intelligent killer robots. No matter what ideas (((society))) tries to plant in your head.

I've also created a video playlist to help you get started, along with this amazingly helpful source material, useful to people already involved in genetic programming.

Sources:

Cooperative Inverse Reinforcement Learning

https://arxiv.org/abs/1606.03137

Cleomar Pereira da Silva received his MSc and DSc in Electrical Engineering at the Pontifical Catholic University of Rio de Janeiro (2015). Currently, he is a professor in the Department of Education Development at the Federal Institute of Education, Science and Technology Catarinense. His current research interests are GPUs and genetic programming.


Evolving GPU machine code

https://www.researchgate.net/publication/277870480_Evolving_GPU_machine_code

https://www.sciencedirect.com/science/article/pii/S0045790615001342

Reverse Engineering:

Reverse engineering the GPU

And here are the keys to the kingdom for those who want to reverse-engineer the GPU...

Understanding the GPU Microarchitecture to Achieve Bare-Metal Performance Tuning


