Magento 2 code quality tools

If you are planning to publish an extension to Magento's marketplace, it will have to pass a few quality gates before getting published. Magento provides some tools that let you test locally before submitting the extension.

First, there is https://github.com/magento/magento-coding-standard . While this tool is mostly centered around code formatting, it can catch some nastier stuff too, like loading a model in a loop or using deprecated libraries. Assuming that you will be testing your extension in the context of a Magento website, you first need to add the following to your composer.json file, at the root level:

"scripts": {
    "post-install-cmd": [
        "([ $COMPOSER_DEV_MODE -eq 0 ] || vendor/bin/phpcs --config-set installed_paths ../../magento/magento-coding-standard/)"
    ],
    "post-update-cmd": [
        "([ $COMPOSER_DEV_MODE -eq 0 ] || vendor/bin/phpcs --config-set installed_paths ../../magento/magento-coding-standard/)"
    ]
}

Then install the coding standards package (the scripts above register the standard with phpcs after every install/update, but skip this when composer runs with --no-dev):

composer require --dev magento/magento-coding-standard

Then run the checker:

vendor/bin/phpcs --standard=Magento2 app/code/My/Extension

This will provide an output like:

FOUND 0 ERRORS AND 16 WARNINGS AFFECTING 12 LINES
--------------------------------------------------------------------------------------------------------------------------------
  7 | WARNING | [x] Opening brace of a class must be on the line after the definition
 11 | WARNING | [x] The first parameter of a multi-line function declaration must be on the line after the opening bracket
 12 | WARNING | [x] Multi-line function declaration not indented correctly; expected 8 spaces but found 32
 13 | WARNING | [x] Multi-line function declaration not indented correctly; expected 8 spaces but found 32
 13 | WARNING | [x] The closing parenthesis of a multi-line function declaration must be on a new line
 14 | WARNING | [x] The closing parenthesis and the opening brace of a multi-line function declaration must be on the same line
 45 | WARNING | [ ] Expected "if (...) {\n"; found "if(...) {\n"
 45 | WARNING | [x] Expected 1 space(s) after IF keyword; 0 found
 45 | WARNING | [x] No space found after comma in function call
 47 | WARNING | [x] Expected 1 space after closing brace; newline found
 48 | WARNING | [ ] Expected "} else {\n"; found "}\n        else {\n"
 55 | WARNING | [ ] Line exceeds 120 characters; contains 125 characters
 56 | WARNING | [x] No space found after comma in function call
 56 | WARNING | [x] No space found after comma in function call
 59 | WARNING | [ ] Code must not contain multiple empty lines in a row; found 2 empty lines.
 64 | WARNING | [x] Expected 1 newline at end of file; 0 found
--------------------------------------------------------------------------------------------------------------------------------
PHPCBF CAN FIX THE 12 MARKED SNIFF VIOLATIONS AUTOMATICALLY
--------------------------------------------------------------------------------------------------------------------------------

Some of the issues (those marked with [x] above) can be fixed automatically:

vendor/bin/phpcbf --standard=Magento2 app/code/My/Extension
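
As an aside, phpcs can also filter by severity. If I recall correctly, the Marketplace gate only blocks on the most severe issues, so a run like the one below approximates what would actually fail a submission locally (treat the exact threshold as an assumption on my part):

vendor/bin/phpcs --standard=Magento2 --severity=10 app/code/My/Extension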

Finally, Magento also offers https://github.com/magento/marketplace-tools . This is not a quality tool per se, just a little helper that checks that the archive you are about to upload is correctly packaged. Use it like this:

./validate_m2_package.php app/code/My/Extension/Extension.zip

The Marketplace runs more checks than these, so your extension can still get rejected, but the tools above should catch the obvious issues before you start the submission process. I also find them useful for quickly checking the quality of a third-party extension.

NodeJS – doing external calls in parallel

I am still getting my head around Node's async model. The specific problem I was tackling was calling a paged API endpoint. In most programming languages, you'd call the first page, wait for the data, then call the second and so on. It turns out there is a better way to do this in Node. Start by defining a few initial variables:

const fetch = require("node-fetch");
let currentPage = 1;
const queryUrl = 'http://local.dev/api.php';
let data = [];

Then, create an array of all the URLs you want to call. This means you need to know the number of pages in advance, which can be done with a single initial call (see the aside after the snippet below). For simplicity, I assume the number of pages is known:

let urls = [];
while (currentPage <= 5) {
  urls.push(queryUrl+'?page='+currentPage);
  currentPage++;
}
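
As an aside, that initial call could look something like this sketch. The totalPages field is a hypothetical response property – adjust it to whatever your API actually returns:

// Hypothetical helper: discover the page count with one initial request,
// assuming the API exposes a "totalPages" field in its JSON response.
async function getTotalPages() {
  const response = await fetch(queryUrl + '?page=1');
  const body = await response.json();
  return body.totalPages;
}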

Now, fire off the calls. Since map starts every request immediately and returns an array of promises, this code does not wait for each individual call to finish:

const promises = urls.map(async url => {
  try {
    console.log("Querying " + url);
    const response = await fetch(url);
    return response.json();
  } catch (err) {
    console.log(err);
    return []; // an empty page, so the concat() below doesn't insert undefined
  }
});

Finally, assemble the data returned by each promise. Since await is only valid inside an async function in a CommonJS script, the loop goes inside a small wrapper:

// top-level await is not available in a CommonJS script, so wrap the loop
(async () => {
  for (const promise of promises) {
    data = data.concat(await promise);
  }
})();

The data array now contains all the info, in the correct page order no matter what order the API calls actually completed in, because we await the promises in the order we created them. Neat.
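
As a side note, the same assembly could be written with Promise.all instead of the loop above; it also resolves to results in the original input order. A sketch, assuming each page resolves to an array as above:

(async () => {
  // Promise.all resolves to an array of results in the same order as `promises`
  const pages = await Promise.all(promises);
  data = pages.flat(); // flatten the per-page arrays into one list
})();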

On compromise

As software developers, we often need to compromise. Budgets run short, requirements change, launch dates must be honored. Everyone who has been in the field for more than a couple of years has plenty of examples of rushing through a project, cutting corners, skipping testing and generally delivering a suboptimal solution to hit time or budget constraints.

But how much compromise is enough? Hard to tell, as software development is not heavily regulated. Other fields have a term called "malpractice", which means that the professional (and not necessarily the company that hires them) is legally responsible for certain types of mistakes. This ensures, for example, that no surgeon does half the surgery because a manager wanted it done quicker. They are liable for their work; finger-pointing will not make the consequences go away. Luckily, as an e-commerce developer, I cannot kill anyone with my mistakes. But I can lose millions of dollars. I can even make a company go under.

That's why, a while ago, I decided there are some lines I will not cross no matter the pressure. I would go as far as losing the job or project rather than compromise. But let me step back a bit. The first thing you need to understand as a professional developer is your role. Basically, you are responsible for:

  • Defining the architecture. The architecture must work; there is no point in proposing something that's flawed from the start. A cheap car that does not work has no value.
  • Estimating. Project Managers or CEOs have wishes about when to launch, but you are responsible for telling them whether it can be done. It might not be possible, and that's ok. You are also responsible for NOT estimating when you cannot. When you go to a physician with an illness that has no definite timespan, the physician will not promise you will be cured in one month. They will tell you to follow a treatment and check in from time to time. It will be done when it is done. The worst words a project manager can hear, but sometimes that's the truth.
  • Letting everyone know if you discover you were wrong. That's a hard discussion to have, but keep in mind that everyone's goal is a working solution. There might be consequences for you, but remember: you're the professional and must act accordingly.

Now, back to compromises. It's hard to tell what really puts the project at risk and what's not a big deal, especially under pressure. Personally, I have compiled a list of things I do not compromise on:

  • Always reproduce the bug first. This might be very complex, but if I cannot reproduce it, how can I verify my solution?
  • Always test all use-cases myself. While one might decide that QA by the dedicated team can be skipped, I never skip my own testing.
  • Always take a moment to think through the security implications. Never leave a security flaw in place with a note to fix it later.
  • Never cut a corner that I know I cannot uncut. It’s ok to write messier code under pressure, but only if there really is a way to go back and fix it.

I guess it ultimately boils down to being ok with the solution you provide. Not thrilled by it, not even happy, but at least ok with it. A developer who sees the code a few months later should be able to say the solution is good enough.

There are of course a lot of other things I can compromise on:

  • putting in extra hours to see the project through.
  • implementing a suboptimal solution knowing that I can go back and fix it later – with the client's approval and understanding of the compromise, of course.
  • hard-coding, code duplication, bending coding standards, not-so-performant solutions and everything else related to code quality, as long as the solution is still ok-ish.

Even the above compromises do not play well long-term. While they will not do major damage at any specific point in time, they add tech debt that makes the project harder and harder to work on. Each change becomes more expensive and error-prone. If the client is permanently in rush mode, it's ok to warn them a few times and, at some point, look for another project or job. Leaving a project is never easy, but I prefer that to knowing I was the one who slowly pushed it past the point of no return.

A case against one step checkouts

Magento 2 provides a versatile checkout process, mainly consisting of two steps – shipping and billing. For whatever reason, a fair number of merchants try to get away from it and use a one step checkout – a checkout that combines both steps into one. The reasoning seems to be that presenting all information at once makes it easier to check out.

However, I have seen a lot of merchants invest in such a checkout, only to revert to the original one after a while. I think several factors contribute to this.

Usability

  • Combining more steps into one means more fields on the same screen. A wizard-like approach is cleaner.
  • The customer still needs to fill in the same number of fields.
  • Unexpected changes in fields. A customer might fill in their card details, then change the shipping address to a region where cards are not accepted; their payment info must then be re-entered.

Implementation

  • A lot more AJAX requests. This can be mitigated by a proper implementation, but that's not always the case.

Maintenance

  • You open up many more paths through the checkout, making testing more difficult.
  • All one step checkout extensions are a bit heavy, so Magento upgrades become harder.
  • Other checkout-related extensions might not work without even more changes.

Unknown

  • There must be a reason why Magento (and even Shopify) ships with a multi-step checkout. I am not familiar with the research that led them down this path, but I assume it was not accidental.

On the other hand, I am curious whether you have more information on when a one step checkout is a good solution for Magento.

Logging SFTP activity

Logging SFTP activity can be done (on most Linux systems) by editing /etc/ssh/sshd_config. Simply find:

Subsystem sftp /usr/libexec/openssh/sftp-server

And change to:

Subsystem sftp /usr/libexec/openssh/sftp-server -l INFO
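
Before restarting, it's worth validating the edited file; sshd's test mode checks the configuration for syntax errors without touching the running daemon:

sshd -t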

Then restart the ssh daemon:

systemctl restart sshd

The INFO log level is just one of many – there are others, like VERBOSE and DEBUG – but INFO is usually a good compromise between detail and noise. To see the logs, simply tail /var/log/messages (on Debian-based systems, /var/log/auth.log):

tail -f /var/log/messages | grep /the/directory/i/care/about

Composer artifacts – an easy way to include composer packages with no repos in your code

As a Magento developer, I always prefer to add 3rd party extensions using composer. Sadly, a fair number of vendors still provide archives with their modules instead of composer-ready repositories. Creating a separate private repository for each extension and then including it in my project seems like overkill, and there is a better solution for this use case – composer artifacts.

The idea is pretty simple – create an archive with your composer-ready code, add it in a directory of your main project, then simply ask composer to use it.

As an example, let’s assume we have a fictional module that has the following composer.json file:

{
  "name": "paul/artifact",
  "description": "this is an artifact",
  "minimum-stability": "stable",
  "version": "1.0",
  "license": "proprietary",
  "authors": [
    {
      "name": "test",
      "email": "email@example.com"
    }
  ]
}

The only part to keep in mind is the name of the package – paul/artifact in this case. To use it, create a zip archive of the code (including the composer.json file), then add it to your project in a subdirectory, e.g. artifacts/. The name of the zip archive itself is irrelevant.
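
Creating the archive itself is a one-liner. Assuming, hypothetically, that the module code lives in a module-src/ directory inside your project root:

cd module-src && zip -r ../artifacts/paul-artifact-1.0.zip . -x '.git/*'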

In your main project, you can make composer aware of the artifacts directory by adding the following new repository type:

"repositories": [
    .....
    {
      "type": "artifact",
      "url": "artifacts"
    }
  ],

The artifact type is what makes composer search for packages as zip archives instead of pulling from source control repos. The "url" is the name of the directory where the zip archives live, relative to your project root.

Once this is done, you simply require the package as you always do:

"require": {
    "paul/artifact": "1.*"
  }

This brings in all of composer's features – it installs dependencies, locks versions, etc. There is virtually no difference between an artifact package and any other type of package.
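
Alternatively, instead of editing composer.json by hand, the usual CLI command works too, since composer resolves the package from any configured repository:

composer require "paul/artifact:1.*"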

A small gotcha – if you want to update the package (say you zipped up a new version), don't forget to clear composer's cache by running composer clear-cache. Composer caches artifacts exactly as it caches remote repos, so if you are not changing version numbers you have to intentionally clear the cache for your new version to be picked up.

Hope this saves you some time.

The current state of the Magento Marketplace

One of the best reasons to use Magento is the community extensions. Free or paid, they augment Magento's core features and save a lot of development time. Magento has always supported this effort by offering a curated extension list. In the old days, this was called Magento Connect. It was simply an extension directory; Magento did not intermediate support or purchases. It still had value though, as it included reviews which were, of course, more relevant than the five-star testimonials on a vendor's own site.

A short history of the marketplace

Magento Connect had a lot of downsides though. The approval time was very long, and all the focus was on marketing: you had to have pretty images and descriptions, but you could get away with very low quality, or even broken, code. Worst of all, Magento heavily promoted the idea that you could go to the marketplace, get an extension, upload it via SFTP to your server and use it. Magento was not (and still isn't) that kind of system. This resulted in a large number of bad reviews from non-technical merchants ("This extension breaks your site", "I am not able to upload, PLS HELP!", "Not good! Do not USE!").

Magento's approach also frustrated the open source community. It's one thing to charge for an extension and provide support to merchants who cannot afford a developer; it's a totally different thing to provide free software but still have to install the extension and fix incompatible merchant stores. As a result, many open source developers simply pulled their extensions from Magento Connect and kept them on Github only, where the audience is more technical. I found myself searching for an extension on Github first and on Connect second, which defeated the whole purpose of having an extension directory.

Fast forward to Magento 2: the approach changed completely and Magento Connect was replaced by Magento Marketplace. There were major improvements right from the start:

  • Magento intermediates the purchase (and charges the extension vendor a fee). Now I can at least turn to Magento for a refund instead of chasing obscure vendors.
  • You can no longer post a review unless you actually purchased the extension.
  • Better experience overall (more relevant searches and better extension description pages, to name a couple).

What did not improve from the start was extension quality. Initially, quality was actually worse than on the old Magento Connect. Magento probably needed a big number of extensions to offer, so they accepted pretty much anything (as long as you had pretty pictures and descriptions!). Vendors rushed to release their Magento 1 extensions for Magento 2 as soon as possible, ignoring all coding standards and architectural approaches.

Luckily, Magento worked hard and improved this. Here is how the current process looks:

[Screenshot: diagram of the current Marketplace submission process]

First, there are now two parallel, completely separate tracks – Marketing and Technical.

Marketing

The marketing flow is about the pretty pictures and description. Magento actually made this part really useful:

  • You need to add screenshots. Really important, as they are the easiest way to quickly understand whether the extension does what you need it to do.
  • The description has stricter guidelines so that it's focused on the feature set, not on the sales pitch.
  • You have to submit an installation and user manual.
  • You have to clearly specify Magento version compatibility and pricing, including whether the extension relies on additional third party subscriptions.

Technical

This is very important for a developer/agency. Personally, I try my best to write good quality code in my projects; then the merchant buys an extension from the official marketplace and I am puzzled at how low its quality is. Or at least, that is how it used to work. Now there is a technical review with three main steps:

  • An automated code sniffer. It catches quite a few things – it even caught a few issues in my code, even though I consider myself "seasoned". While it's still just an automated checker, you can no longer get away with blatant mistakes.
  • A Varnish test. Basically, a check that the extension does not break caching. In the past I had to ask for refunds on extensions because their architecture simply disabled caching.
  • A manual QA test. While I am not sure what exactly happens there, I like to think that a real person checks the basic flows of the extension and maybe looks over the code too.

I am sure the above works. First, there is no way to bypass the review process: if the automated code tester finds issues, you have to fix them. Second, I can simply feel how the quality has increased. It is becoming the exception that I buy an extension and have a bad experience. In fact, I currently only buy extensions through the marketplace, as I trust Magento's reviews. At least for me, the Magento-owned marketplace concept finally worked.

Why is it better

Besides the above, there are a few not-so-obvious improvements that really helped:

  • The manual review team seems better trained. I have not had an extension rejected for silly reasons in a while.
  • The review process is faster. No more waiting for months.
  • There is better documentation on how a submission should look, at least on the marketing side.

What’s still missing

While the extension marketplace is better, it's still a long way from great imho. Here is what I'd like to see in the future:

  • A Magento-owned demo site so I can check the extension before buying. The vendors now take care of the demos, but not all of them do it properly.
  • A git repo for the extension. Being able to see the commit history helps me a lot.
  • Magento should own security issues. Sadly, vendors do a poor job of communicating security issues. I'd like to be able to submit a form to Magento when I find one; Magento should work with the vendor on a fix, and then all extension owners should be notified. This is left to the vendor's discretion now. Most of them simply release a new version but forget about a patch, or even about mentioning that the upgrade fixes a critical security issue.
  • As an extension vendor, I’d love to see subscription-based pricing.

Conclusion

In the last year, I have started to trust the marketplace as an authoritative entity in the extension world. While there are a few things left to improve, Magento is definitely moving in the right direction. I expect that by the end of 2019 we will have an even better marketplace.