Decent WordPress workflows with WPEngine

Since I started doing WordPress work, I have been struggling to make the dev workflow more predictable. The WordPress world does not seem to be very cautious in general; I have seen quite a few projects where the code was not versioned at all and updates were done via the admin, directly in production. Luckily, the folks at WPEngine seem to cater to projects where a more structured approach matters.

First of all, every WP Engine hosted website provides the option to add a dev and even a staging environment – https://wpengine.com/support/environments/ . Even better, they offer an option to sync data between the environments, i.e. you can pull the production database back to staging anytime it lags behind. This alone is pretty cool: with a few clicks, you or the client have a “backup” where new ideas can be tested without causing a production outage.

Moving on, you can “tie” each environment to a git repo – https://wpengine.com/support/git/ . The process is pretty simple – add your public key, then get a full featured repo for each environment. Whenever you push a branch, the content becomes visible on the environment associated with the repo. It’s pretty fast too, adding only a few seconds of build time per push. If you use multiple branches, the last branch you push is the one that reflects the code changes, so you can work with feature branches and any time you push a new one it takes over.
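
Under the hood, each environment is just a regular git remote. A quick sketch of the setup – the install name mysite is a placeholder, and the remote URL follows the pattern from their docs:

git remote add production git@git.wpengine.com:production/mysite.git
git push production master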

One thing that I never liked about WordPress is how complicated it is to avoid versioning the core. Since there is no Composer-based setup, you have to create a monstrous gitignore to exclude the core, which is scattered all over the place. However, WPEngine makes it pretty easy:

  • You use their gitignore to exclude anything core related from your git repo (a sketch of what this looks like follows this list).
  • You commit your own modules.
  • When the time comes for upgrades, you can do them locally and test. Because of the gitignore, there is nothing that needs to be committed.
  • You use their admin panel to do a one-click upgrade of WordPress.
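
For illustration, here is a minimal sketch of what such a gitignore can look like – not their exact file, just the general shape: ignore everything by default, then whitelist your own code back in (my-theme and my-plugin are placeholders):

# ignore everything by default (WordPress core lives at the root)
/*
!.gitignore
# re-include only your own code under wp-content
!wp-content/
wp-content/*
!wp-content/themes/
wp-content/themes/*
!wp-content/themes/my-theme/
!wp-content/plugins/
wp-content/plugins/*
!wp-content/plugins/my-plugin/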

While I think their WP version is slightly modified for some extra features, I never ran into an issue where something worked on my local but not on their hosting. And in any case, you can run the process on the dev environment first just to be sure.

Finally, a pleasant surprise is that everything is quick. Builds take a few seconds, upgrades less than a minute, and most of the time it just works. When it did not, their support responded on chat within minutes and helped almost in real time.

Of course, if you don’t care about dev envs/git repos you can just ignore all those WPEngine features and keep working directly in production. They did a good job of making all the steps a choice, not a requirement.

I’m glad services like WPEngine exist; they can really help you get a structured deploy process in place in just a few minutes.

Magento 2 code quality tools

If you are planning to publish an extension to Magento’s marketplace, it will have to pass a few quality gates before getting published. Magento provides some tools that let you test locally before submitting the extension.

First, there is https://github.com/magento/magento-coding-standard . While this tool is mostly centered around code formatting, it can catch some nastier stuff too, like loading a model in a loop or using deprecated libraries. Assuming that you will be testing your extension in the context of a Magento website, you first need to add the following to your composer.json file, at root level:

"scripts": {
    "post-install-cmd": [
        "([ $COMPOSER_DEV_MODE -eq 0 ] || vendor/bin/phpcs --config-set installed_paths ../../magento/magento-coding-standard/)"
    ],
    "post-update-cmd": [
        "([ $COMPOSER_DEV_MODE -eq 0 ] || vendor/bin/phpcs --config-set installed_paths ../../magento/magento-coding-standard/)"
    ]
}

Then install the coding standards package:

composer require --dev magento/magento-coding-standard
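
You can verify that the standard was picked up by listing the installed coding standards – Magento2 should be among them:

vendor/bin/phpcs -i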

Then run the checker:

vendor/bin/phpcs --standard=Magento2 app/code/My/Extension

This will produce output like:

FOUND 0 ERRORS AND 16 WARNINGS AFFECTING 12 LINES
--------------------------------------------------------------------------------------------------------------------------------
  7 | WARNING | [x] Opening brace of a class must be on the line after the definition
 11 | WARNING | [x] The first parameter of a multi-line function declaration must be on the line after the opening bracket
 12 | WARNING | [x] Multi-line function declaration not indented correctly; expected 8 spaces but found 32
 13 | WARNING | [x] Multi-line function declaration not indented correctly; expected 8 spaces but found 32
 13 | WARNING | [x] The closing parenthesis of a multi-line function declaration must be on a new line
 14 | WARNING | [x] The closing parenthesis and the opening brace of a multi-line function declaration must be on the same line
 45 | WARNING | [ ] Expected "if (...) {\n"; found "if(...) {\n"
 45 | WARNING | [x] Expected 1 space(s) after IF keyword; 0 found
 45 | WARNING | [x] No space found after comma in function call
 47 | WARNING | [x] Expected 1 space after closing brace; newline found
 48 | WARNING | [ ] Expected "} else {\n"; found "}\n        else {\n"
 55 | WARNING | [ ] Line exceeds 120 characters; contains 125 characters
 56 | WARNING | [x] No space found after comma in function call
 56 | WARNING | [x] No space found after comma in function call
 59 | WARNING | [ ] Code must not contain multiple empty lines in a row; found 2 empty lines.
 64 | WARNING | [x] Expected 1 newline at end of file; 0 found
--------------------------------------------------------------------------------------------------------------------------------
PHPCBF CAN FIX THE 12 MARKED SNIFF VIOLATIONS AUTOMATICALLY
--------------------------------------------------------------------------------------------------------------------------------

You can fix some of the issues automatically:

vendor/bin/phpcbf --standard=Magento2 app/code/My/Extension

Finally, Magento also offers https://github.com/magento/marketplace-tools . This is not a quality tool per se, just a little helper to check that the archive you are about to upload is correct. Use it like:

./validate_m2_package.php app/code/My/Extension/Extension.zip
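
As for producing the archive itself, the marketplace expects the module files (including registration.php and composer.json) at the top level of the zip, so something along these lines should do – the paths are just examples:

cd app/code/My/Extension
zip -r Extension.zip . -x 'Extension.zip' '.git/*'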

Magento runs more checks against your extension during the review process, so it can still get rejected, but the above should catch the obvious issues before you start the submission. I also find these tools useful for quickly checking the quality of a 3rd party extension.


NodeJS – doing external calls in parallel

I am still getting my head around Node’s async model. The specific problem I was tackling was calling a paged API endpoint. In most programming languages, you’d call the first page, wait for the data, then call the second and so on. It turns out there is a better way to do this in Node. Start by defining a few initial variables:

const fetch = require("node-fetch");
let currentPage = 1;
const queryUrl = 'http://local.dev/api.php';
let data = [];

Then, create an array of all the URLs you want to call. This means you need to know the number of pages in advance, which can be done with a single initial call. For simplicity, I assume I already know the number of pages (see the sketch after this snippet for the discovery call):

let urls = [];
while(currentPage <= 5) {
  urls.push(queryUrl+'?page='+currentPage);
  currentPage++;
}
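
If the page count is unknown, the initial discovery call could look like this sketch – it assumes the API returns a totalPages field, which you would adapt to whatever your endpoint actually exposes (and it must run inside an async function):

// hypothetical first call to find out how many pages there are
const firstResponse = await fetch(queryUrl + '?page=1');
const firstPage = await firstResponse.json();
const totalPages = firstPage.totalPages; // adapt to your API's response shape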

Now, just do the calls. Since we are getting back promises, this code will not wait for each individual call to finish:

const promises = urls.map(async url => {
  try {
    console.log("Querying " + url);
    const response = await fetch(url);
    // parsing the body is async too, so this returns a promise as well
    return response.json();
  } catch (err) {
    // a failed call will resolve to undefined; handle as needed
    console.log(err);
  }
});

Finally, assemble the data returned by each promise, in order (note that await must run inside an async function):

for (const promise of promises) {
  data = data.concat(await promise);
}

The data array will contain all the results, in the correct page order, even though the API calls may have completed in a different order. Neat.
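
As a side note, the same assembly can be done with Promise.all, which awaits all the promises at once and preserves the input order:

// equivalent to the loop above; each entry is one page's array, in order
const pages = await Promise.all(promises);
data = pages.flat(); // flat() requires Node 11+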

On compromise

As software developers, we often need to compromise. Budgets run short, requirements change, launch dates must be honored. Everyone in the field for more than a couple of years has plenty of examples of rushing through a project, cutting corners, skipping testing and generally delivering a suboptimal solution to hit time or budget constraints.

But how much compromise is enough? Hard to tell, as software development is not that regulated. In other fields, there is a term called “malpractice”, which means that the professional (and not necessarily the company that hires them) is legally responsible for certain types of mistakes. This ensures, for example, that no surgeon does half the surgery because the manager wanted it done quicker. They are liable for their work; finger pointing will not make the consequences go away. Luckily, as an e-commerce developer, I cannot kill anyone with my mistakes. But I can lose millions of dollars. I can even make a company go under.

That’s why, a while ago, I decided that there are some lines I will not cross no matter the pressure. I would go as far as losing the job/project instead of compromising. But let me step back a bit. The first thing you need to understand is your role as a professional. Basically, you are responsible for:

  • Defining the architecture. The architecture must work. There is no point in proposing something that’s flawed from the start. A cheap car that does not work does not have any value.
  • Estimating. Project Managers or CEOs have wishes on when to launch, but you are responsible for telling them if it can be done. It might not be possible, and that’s ok. You are also responsible for NOT estimating if you cannot. When you go to a physician with an illness that does not have a definite timespan, the physician will not promise you will be cured in one month. They will tell you to follow a treatment and check in from time to time. It will be done when it is done. The worst words a project manager can hear, but sometimes that’s the truth.
  • Letting everyone know if you discover you were wrong. That’s a hard discussion to have, but keep in mind that everyone’s goal is a working solution. There might be consequences for you, but remember you’re the professional and must act accordingly.

Now, back to compromises. It’s hard to tell what really puts the project at risk and what’s not a big deal, especially under pressure. Personally, I have compiled a list of things I am not compromising on:

  • Always reproduce the bug first. This might be very complex, but if I cannot do it, how can I check my solution?
  • Always test all use-cases myself. While one might decide that QA by the dedicated team can be skipped, I am never skipping my own testing.
  • Always take a moment to think through the security implications. Never leave a security flaw in place with the note to fix it later.
  • Never cut a corner that I know I cannot uncut. It’s ok to write messier code under pressure, but only if there really is a way to go back and fix it.

I guess it ultimately boils down to being ok with the solution you provide. Not thrilled by it, not even happy, but at least ok with it. If a developer sees the code a few months later, they should be able to say the solution is good enough.

There are of course a lot of other things I can compromise on:

  • putting in extra hours to see the project through.
  • implementing a suboptimal solution knowing that I can go back and fix it later. Of course, with the client’s approval/understanding of the compromise.
  • hard coding, code duplication, coding standards, not-so-performant solutions and everything else related to code quality, as long as the solution is still ok-ish.

Even the above compromises do not play well long-term. While they will not do major damage at any specific point in time, they add tech debt that makes the project harder and harder to work on. Each change becomes more expensive and error-prone. If the client is always in rush mode, it’s ok to warn them a few times and, at some point, look for another project/job. Leaving a project is never easy, but I prefer that to knowing I was the guy who slowly pushed it past the point of no return.

A case against one step checkouts

Magento 2 provides a versatile checkout process, mainly consisting of two steps – shipping and billing. For whatever reason, a fair number of merchants try to get away from it and use a one step checkout – a checkout that combines both steps into one. The reasoning seems to be that presenting all the information at once makes it easier to check out.

However, I have seen a lot of merchants invest in such a checkout, only to revert to the original one after a while. I think several reasons contribute to this.

Usability

  • combining more steps into one means more fields on the same screen. A wizard-like approach is cleaner.
  • the customer still needs to fill in the same number of fields.
  • unexpected changes in fields. One might fill in the card details, then change the shipping address to a region where you don’t take cards. Their payment info must be re-entered.

Implementation

  • A lot more ajax requests. This can be mitigated by a proper implementation, but that’s not always the case.

Maintenance

  • You open up many more paths through the checkout, making testing more difficult.
  • All one step checkout extensions are a bit heavy, so Magento upgrades become harder.
  • Other checkout-related extensions might not work without even more changes.

Unknown

  • There must be a reason why Magento (and even Shopify) ship with a multi-step checkout. I am not that familiar with the research that led them down this path, but I assume it was not accidental.

On the other hand, I am curious whether you have more information on when a one step checkout is a good solution for Magento.

Logging sftp activity

Logging SFTP activity can be done (on most Linux systems) by editing /etc/ssh/sshd_config . Simply find:

Subsystem sftp /usr/libexec/openssh/sftp-server

And change to:

Subsystem sftp /usr/libexec/openssh/sftp-server -l INFO

Then restart the ssh daemon:

systemctl restart sshd
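
Before restarting, it’s worth validating the file with sshd’s built-in test mode – it prints nothing when the config is valid and an error otherwise:

sshd -t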

The INFO log level is just one of many – there are others, like VERBOSE, DEBUG etc. – but usually INFO is a good compromise. To see the logs, simply tail /var/log/messages (on Debian-based systems, look in /var/log/auth.log instead):

tail -f /var/log/messages | grep /the/directory/i/care/about

Composer artifacts – an easy way to include composer packages with no repos in your code

As a Magento developer, I always prefer to add 3rd party extensions using composer. Sadly, a fair number of vendors still provide archives with their modules instead of composer-ready repositories. Creating a separate private repository for each extension and then including them in my project seems like overkill, and it turns out there is a better solution for this use case – composer artifacts.

The idea is pretty simple – create an archive with your composer-ready code, add it in a directory of your main project, then simply ask composer to use it.

As an example, let’s assume we have a fictional module that has the following composer.json file:

{
  "name": "paul/artifact",
  "description": "this is an artifact",
  "minimum-stability": "stable",
  "version": "1.0",
  "license": "proprietary",
  "authors": [
    {
      "name": "test",
      "email": "email@example.com"
    }
  ]
}

The only part to keep in mind is the name of the package, paul/artifact in this case. To use it, create a zip archive of the code (including the composer.json file), then add it to your project in a subdirectory, e.g. artifacts/. The name of the zip archive is irrelevant.
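
Creating the archive can be as simple as the sketch below – run it from the module’s root so that composer.json ends up at the top level of the zip (the paths and file name are just examples):

cd my-module/
zip -r ../my-project/artifacts/paul-artifact-1.0.zip .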

In your main project, you can make composer aware of the artifacts directory by adding the following new repository type:

"repositories": [
    .....
    {
      "type": "artifact",
      "url": "artifacts"
    }
  ],

The artifact type is what makes composer search for packages as zip archives, as opposed to pulling from source control repos. The “url” is the directory where the zip archives live, relative to your project root.

Once this is done, you simply require the package as you always do:

"require": {
    "paul/artifact": "1.*"
  }

This brings in all of composer’s features – it will install dependencies, lock versions etc. There is virtually no difference between an artifact package and any other type of package.

A small gotcha – if you want to update the package (say you zipped up a new version), don’t forget to clear composer’s cache (by running composer clear-cache). Composer caches artifacts exactly as it caches remote repos, so if you are not changing version numbers you have to intentionally clean the cache so that your new version is picked up.

Hope this saves you some time.