Why do I not use a Virtual Machine from my local box?
August 5th, 2020
The first place I worked, iSTORM, didn't use virtual machines. They had a staging server and a production server. We developed on the staging server and used CVS to synchronize code into production.
Later in my career there was a project where part of the consistent environment setup was everybody using the same virtual machine builds, which mimicked the online staging environments. At first I questioned it. Isn't it better to just do it like I always had, with one communal environment?
I thought I would give virtual machines a chance. After all, this was a fairly big firm, and as things grow, other steps need to be put in place to ensure things go smoothly.
It didn't take long to hit a problem. Basically, Apache's default configuration on Ubuntu was truncating files at some character limit when running under VirtualBox. The fix was adding one directive to httpd.conf.
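This sounds like the well-known sendfile bug with VirtualBox shared folders, where Apache serves stale or truncated file contents; assuming that was the issue here, the directive would have been:

```apache
# Assumed fix: disable Apache's sendfile optimization, which misbehaves
# on VirtualBox shared folders and can serve truncated or stale files.
EnableSendfile Off
```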
OK, so I figured it out and let the rest of the team know. I don't recall if it impacted anybody other than me, as I was the only one working from a Linux box using a VM. But it took about a day for all the devs to get their VMs set up.
It was two devs plus one architect, so three employees spent a combined 24 hours of work making their environments consistent.
Compare that to setting up a dev server: it would have taken much less time for me to set up a server to test on. Actually, since we needed to set up a production server anyway, getting that ready first and working backwards (cloning the production server and altering the config) would have taken mere minutes.
The setup I have now doesn't use Virtual Machines to test on. We have 2 AWS EC2 servers and an RDS database instance.
The production server is the stable customer facing environment.
But the dev server does everything else, starting with testing new system dependencies.
How do you manage system dependencies better on VMs? You don't. If I decided I needed some mail encryption with a cipher that had to be installed, I would install it on my virtual machine, and then the other two would each have to spend the same amount of time installing it on their VMs, tripling the effort. Then I would have to install it on the QA servers (there were two AWS EC2 QA instances) plus production, making it 3 VMs + 2 QA servers + 1 production server = 6 installs in total.
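To make the fan-out concrete, here is the arithmetic above sketched as a shell loop. The host names are placeholders, not the project's real hosts, and the commented-out install command is hypothetical:

```shell
# Placeholder hosts illustrating the 3 + 2 + 1 = 6 install fan-out with VMs.
vm_hosts="dev1-vm dev2-vm architect-vm"  # one VM per developer
qa_hosts="qa1 qa2"                       # the two EC2 QA instances
prod_hosts="prod"

installs=0
for host in $vm_hosts $qa_hosts $prod_hosts; do
  # ssh "$host" 'sudo apt-get install -y the-new-cipher'  # would run on every box
  installs=$((installs + 1))
done
echo "$installs installs"  # 6 with per-dev VMs; a shared dev server replaces the 3 VMs with 1 box
```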
Instead, with a single dev server, you have one place for CPU-intensive crons and one environment to test new dependencies on, and everybody inherits the new dependencies at the same time. That isn't always a good thing if there are complications. But considering the additional code downtime that would otherwise be caused by developers installing dependencies themselves, I think you could lose upwards of 8 hours per developer using VMs. As long as we keep the shared server's downtime below that per developer, we are coming out ahead. To minimize downtime further, a developer could do installations off-hours and/or test them on a cloned instance first.
Using VMs adds more points of failure than necessary.
The dev server can be initialized to the point where root can set up a new developer environment with a shell script of fewer than 10 lines: it takes one parameter (the username) and lets the new employee enter a password when prompted.
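A minimal sketch of such a script, written as a function; the group name `devgroup` is a placeholder, and the real script may differ:

```shell
# add_dev: hypothetical onboarding script for the shared dev server.
# Takes one parameter (the username) and prompts the new dev for a password.
add_dev() {
  if [ "$#" -ne 1 ]; then
    echo "usage: add_dev <username>" >&2
    return 1
  fi
  useradd -m -s /bin/bash "$1" || return 1  # create the account with a home directory
  usermod -aG devgroup "$1"                 # "devgroup" is a placeholder group name
  passwd "$1"                               # new employee enters their password when prompted
}
```

Run as root with `add_dev alice`, and the new developer has a working login on the shared environment.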
Fewer Software Installations Minimize Dev Onboarding Time
By not installing VMs on every dev's computer, onboarding a PHP Slim developer takes only as long as entering a username and password before they can start implementing Controllers and Services. Assuming the developer has an IDE or is solid with vi, they can run composer and get straight to work.
A software developer always striving to be better: learn from others' mistakes, learn by doing, fail fast, maximize productivity, and really think hard about good defaults. Computer developers have the power to add an entire infinite dimension with a single Int (or maybe a BigInt). The least we can do with that power is be creative.