On compromise

As software developers, we often need to compromise. Budgets run short, requirements change, launch dates must be honored. Everyone who has been in the field for more than a couple of years has plenty of examples of rushing through a project, cutting corners, skipping testing and generally delivering a suboptimal solution to hit time or budget constraints.

But how much compromise is enough? Hard to tell, as software development is not that regulated. Other fields have a term called “malpractice”, which means that the professional (and not necessarily the company that hires them) is legally responsible for certain types of mistakes. This ensures, for example, that no surgeon performs half a surgery because the manager wanted it done quicker. They are liable for their work; finger pointing will not make the consequences go away. Now, luckily, as an e-commerce developer I cannot kill anyone with my mistakes. But I can lose millions of dollars. I can even make a company go under.

That’s why, a while ago, I decided that there are some lines I will not cross no matter the pressure. I would go as far as losing the job/project instead of compromising. But let me go back a bit. The first thing you need to understand as a professional developer is your role. Basically, you are responsible for:

  • Defining the architecture. The architecture must work. There is no point in proposing something that’s flawed from the start. A cheap car that does not work does not have any value.
  • Estimating. Project Managers or CEOs have wishes on when to launch, but you are responsible for telling them whether it can be done. It might not be possible, and that’s ok. You are also responsible for NOT estimating if you cannot do it. When you go to a physician with an illness that does not have a definite timespan, the physician will not promise you will be cured in one month. They will tell you to follow a treatment and check in from time to time. It will be done when it is done. The worst words a project manager wants to hear, but sometimes that’s the truth.
  • Letting everyone know if you discover you were wrong. That’s a hard discussion to have, but keep in mind that everyone’s goal is a working solution. There might be consequences for you, but remember you’re the professional and must act accordingly.

Now, back to compromises. It’s hard to tell what really puts the project at risk and what’s not a big deal, especially under pressure. Personally, I have compiled a list of things I am not compromising on:

  • Always reproduce the bug first. This might be very complex, but if I cannot reproduce it, how can I verify my solution?
  • Always test all use-cases myself. While one might decide that QA by the dedicated team can be skipped, I never skip my own testing.
  • Always take a moment to think through the security implications. Never leave a security flaw in place with the note to fix it later.
  • Never cut a corner that I know I cannot uncut. It’s ok to write messier code under pressure, but only if there really is a way to go back and fix it.

I guess it ultimately boils down to being ok with the solution you provide. Not thrilled by it, not even happy, but at least ok with it. If a developer sees the code a few months later, they should be able to say that the solution is good enough.

There are of course a lot of other things I can compromise on:

  • putting in extra hours to see the project through.
  • implementing a suboptimal solution knowing that I can go back and fix it later. Of course, with the client’s approval/understanding of the compromise.
  • hard coding, code duplication, looser coding standards, not-so-performant solutions and everything else related to code quality, as long as the solution is still ok-ish.

Even the above compromises do not play out well long-term. While they will not do major damage at any specific point in time, they add tech debt that makes the project harder and harder to work on. Each change becomes more expensive and error-prone. If the client is always in rush mode, it’s ok to warn them a few times and, at some point, look for another project/job. Leaving a project is never easy, but I prefer that to knowing I was the guy who slowly pushed it past the point of no return.

A case against one-step checkouts

Magento 2 provides a versatile checkout process, mainly consisting of two steps – shipping and billing. For whatever reason, a fair number of merchants try to get away from it and use a one-step checkout – a checkout that combines both steps into one. The reasoning seems to be that presenting all the information at once makes it easier to check out.

However, I have seen a lot of merchants invest in such a checkout, only to revert to the original one after a while. I think several reasons contribute to this.

Usability

  • combining several steps into one means more fields on the same screen. A wizard-like approach is cleaner.
  • the customer still needs to fill in the same number of fields.
  • unexpected changes in fields. A customer might fill in their card details, then change the shipping address to a region where you don’t take cards. Their payment info must be re-entered.

Implementation

  • A lot more AJAX requests. This can be mitigated by a proper implementation, but that’s not always the case.

Maintenance

  • You open up many more paths through the checkout, making testing more difficult.
  • All one-step checkout extensions are a bit heavy, so Magento upgrades become harder.
  • Other checkout-related extensions might not work without even more changes.

Unknown

  • There must be a reason why Magento (and even Shopify) ship with a multi-step checkout. I am not that familiar with the research that led them down this path, but I assume it was not accidental.

On the other hand, I am curious whether you have more information on when a one-step checkout is a good solution for Magento.

Logging SFTP activity

Logging SFTP activity can be done (on most Linux systems) by editing /etc/ssh/sshd_config. Simply find:

Subsystem sftp /usr/libexec/openssh/sftp-server

And change to:

Subsystem sftp /usr/libexec/openssh/sftp-server -l INFO

Then restart the ssh daemon:

systemctl restart sshd

The INFO log level is just one of many – there are others, like VERBOSE, DEBUG etc. – but usually INFO is a good compromise. To see the logs, simply tail /var/log/messages:

tail -f /var/log/messages | grep /the/directory/i/care/about
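Note that /var/log/messages is where RHEL/CentOS-style syslog setups put these entries. On Debian/Ubuntu the same lines usually land in /var/log/auth.log instead, and on systemd-based systems you should be able to read them straight from the journal by filtering on the sftp-server syslog identifier (this is an assumption about your logging setup, so double check):

journalctl -t sftp-server -f | grep /the/directory/i/care/about

Either way, the exact log destination depends on your syslog configuration, so adjust the path accordingly.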

Composer artifacts – an easy way to include composer packages with no repos in your code

As a Magento developer, I always prefer to add 3rd party extensions using composer. Sadly, a fair number of vendors still provide archives with their modules instead of composer-ready repositories. Creating a separate private repository for each extension and then including it in my project seems like overkill, and it looks like there is a better solution for this use case – composer artifacts.

The idea is pretty simple – create an archive with your composer-ready code, add it in a directory of your main project, then simply ask composer to use it.

As an example, let’s assume we have a fictional module that has the following composer.json file:

{
  "name": "paul/artifact",
  "description": "this is an artifact",
  "minimum-stability": "stable",
  "version": "1.0",
  "license": "proprietary",
  "authors": [
    {
      "name": "test",
      "email": "email@example.com"
    }
  ]
}

The only part to keep in mind is the name of the package, paul/artifact in this case. To use it, create a zip archive of the code (including the composer.json file), then add it to your project in a subdirectory, e.g. artifacts/. The name of the zip archive is irrelevant.
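For example – assuming the module source (with the composer.json above at its root) sits in a hypothetical paul-artifact/ directory and you are in your project root – the archive could be created like this:

mkdir -p artifacts
cd paul-artifact
zip -r ../artifacts/paul-artifact-1.0.zip .

Adjust the paths to your own layout; the only real requirement is that the module’s composer.json ends up inside the archive.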

In your main project, you can make composer aware of the artifacts directory by adding the following new repository type:

"repositories": [
    .....
    {
      "type": "artifact",
      "url": "artifacts"
    }
  ],

The artifact type is what makes composer look for packages as zip archives instead of pulling them from source control repos. The “url” is the name of the directory where the zip archives live, relative to your project root.
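As a side note – and this is just an assumption about composer’s config command, so double check it – you should be able to add the same repository entry from the CLI instead of editing composer.json by hand (local-artifacts is an arbitrary repository name):

composer config repositories.local-artifacts artifact artifacts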

Once this is done, you simply require the package as you always do:

"require": {
    "paul/artifact": "1.*"
  }

This brings in all of composer’s features – it will install dependencies, lock versions etc. There is virtually no difference between an artifact package and any other type of package.

A small gotcha – if you want to update the package (say you zipped up a new version), don’t forget to clear composer’s cache (by running composer clear-cache). Composer caches artifacts exactly as it caches remote repos, so if you are not changing version numbers you have to intentionally clear the cache so that your new version is picked up.
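So, when shipping a new version of the archive, my update routine looks roughly like this (package name from the example above):

composer clear-cache
composer update paul/artifact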

Hope this saves you some time.

The current state of the Magento Marketplace

One of the best reasons to use Magento is the community extensions. Free or paid, they augment Magento’s core features and save a lot of development time. Magento has always supported this effort by offering a curated extension list. In the old days, this was called Magento Connect. It was simply an extension directory; Magento did not intermediate support or purchasing. It still had value though, as it included a review list which, of course, was more relevant than the 5-star reviews on the vendor’s site.

A short history of the marketplace

Magento Connect had a lot of downsides though. The approval time was very long. All the focus was on marketing: you had to have pretty images and descriptions, but you could get away with very low quality, or even broken, code. Worst of all, Magento heavily promoted the idea that you could go to the marketplace, get an extension, upload it via SFTP to your server and use it. Magento was not (and is not to date) that kind of system. This resulted in a large number of bad reviews from non-technical merchants (“This extension breaks your site”, “I am not able to upload, PLS HELP!”, “Not good! Do not USE!”). Magento’s approach frustrated the open source community. It’s one thing to charge for an extension and provide support to merchants who cannot afford a developer. It’s a totally different thing to provide free software but have to deal with installing the extension and fixing incompatible merchant stores. This resulted in a large number of open source developers simply pulling their extensions off Magento Connect and keeping them on Github only, where the audience is more technical. I found myself searching for an extension on Github first and on Connect second, which defeated the whole purpose of having an extension directory.

Fast forward to Magento 2: the approach changed completely and Magento Connect was replaced by the Magento Marketplace. There were major improvements right from the start:

  • Magento intermediates the purchasing (and charges a fee to the extension vendor). Now I can at least ask Magento for a refund instead of chasing obscure vendors.
  • You can no longer post a review unless you actually purchased the extension.
  • Better experience overall (more relevant searches, more relevant extension description pages to name a few).

What did not improve from the start was the extension quality. Actually, initially the quality was worse than on the old Magento Connect. Probably Magento needed a large number of extensions to offer, so they accepted pretty much anything (as long as you had pretty pictures and descriptions!). Vendors tried to release their Magento 1 extensions for Magento 2 as soon as possible, ignoring all coding standards and architectural approaches.

Luckily, Magento worked hard and improved this. Here is what the current process looks like:

(Screenshot: the current Marketplace submission process)

First, there are now two parallel, completely separate tracks – Marketing and Technical.

Marketing

The marketing flow is about the pretty pictures and description. Magento actually made this part really useful…

  • you need to add screenshots. Really important, as they are the easiest way to quickly understand whether the extension does what you need it to do.
  • the description has stricter guidelines so that it’s focused on the feature set, not on the sales pitch.
  • you have to submit an installation and user manual.
  • you have to clearly specify Magento version compatibility and pricing, including whether the extension relies on other third-party subscriptions.

Technical

This is very important for a developer/agency. Personally, I try my best to write good quality code in my projects. Then the merchant buys an extension from the official marketplace and I am puzzled at how low its quality is. Or at least, this is how it used to work. Now there is a technical review. It has three main steps:

  • an automated code sniffer. It catches quite a few things. It even caught a few issues in my code, even though I consider myself “seasoned”. While it’s still just an automated checker, you cannot make blatant mistakes anymore.
  • a Varnish test. Basically, it checks that the extension does not break caching. I have had to ask for refunds on extensions in the past because their architecture simply disabled caching.
  • a manual QA test. While I am not sure what happens there, I like to think that a real person actually checks the basic flows of the extension and maybe looks over the code too.

I am sure the above works. First, there is no way to bypass the review process: if the automated code checker finds issues, you have to fix them. Then, I can simply feel how the quality has increased. It’s becoming the exception that I buy an extension and have a bad experience. Actually, I currently buy extensions only through the marketplace, as I trust Magento’s reviews. At least for me, the Magento-owned marketplace concept finally worked.

Why is it better

Besides the above, there are a few not-so-obvious improvements that really helped:

  • the manual review team seems a bit better trained. I have not had my extensions rejected for silly reasons in a while.
  • the review process is faster. No more waiting for months.
  • the documentation on how the submission should look is better, at least on the marketing side.

What’s still missing

While the extension marketplace is better, it’s still a long way from great imho. Here is what I’d like to see in the future:

  • A Magento-owned demo site so I can check the extension before buying. The vendors now take care of the demos, but not all of them do it properly.
  • A git repo for the extension. Being able to see the commit history helps me a lot.
  • Magento should own security issues. Sadly, vendors do a poor job of communicating security issues. I’d like to be able to submit a form to Magento when I find one, Magento should work with the vendor to correct it, and then all extension owners should be notified. This is left to the discretion of the vendor now. Most of them simply release a new version but forget about a patch, or even about mentioning that the upgrade fixes a critical security issue.
  • As an extension vendor, I’d love to see subscription-based pricing.

Conclusion

In the last year, I have started to trust the marketplace as an authoritative entity in the extension world. While there are a few things to improve, Magento is definitely moving in the right direction. I expect that by the end of 2019, we will have an even better marketplace.

Magento integrations using Xtento

One of the most common tasks I have to deal with in Magento is integrations. Almost any system I have worked with needs to push order or product data to an ERP or marketing software, or get stock/product info/tracking numbers from an external source. Every integration is unique in a few ways…

  • the communication channel. REST, SOAP, sending files via SFTP are common variations
  • the data format. JSON, XML, CSV to name a few
  • the actual data mapping. We either pass fields as-is or combine/transform them.

However, any good integration has a lot in common:

  • a log must exist so we can refer back to when and how each piece of information was synchronized
  • errors must be logged, with an alert system so we are aware of any failures
  • the system should be able to queue data in case the integration is down. We cannot lose info, especially when dealing with financial records.
  • the system must be able to retry failed records
  • the actual field mapping must be easy to change, ideally without changing code

I have been looking for a good solution for building integrations for a while. On the lower end, there is the do-it-yourself custom shell script. Easy to build, but usually missing key elements, like retries or flexible data mapping. On the higher end, we have full ETL solutions. They tend to be expensive and add new software components to the mix.

Almost accidentally, I stumbled upon Xtento’s Import/Export suite – https://www.xtento.com/magento-2-extensions.html. They had flown under my radar, as I had bad experiences with such extensions in the past and had concluded, for a while, that the best import/export is the one you build on top of Magento’s defaults.

Let’s go over the steps involved in exporting orders from Magento via Xtento. First, you define destinations. A destination is the place where your orders will go. You have a few options:

  • on the local server, in a specified folder
  • on a remote server, via an SFTP or FTP connection
  • to an email address
  • to an HTTP server (e.g. REST)
  • to a web service. Use this for XML-RPC or SOAP integrations
  • via a custom class, where you can define the communication with your exotic system

The above options should cover any possible target. One nice thing is that you can have multiple destinations, so you could place the orders on an SFTP server but also mail a copy of the files to yourself for later review.

After defining the destinations, the next step is to define the export profile. Most options are obvious, so I will only go over the important ones:

  • you can choose the entity to export, e.g. orders. Usually each exportable entity corresponds to a specific extension that you need to buy from Xtento.
  • you can define filters. For example, you can decide to export only “processing” orders, keeping the “pending” ones in the queue until they are reviewed.
  • You can define the export mapping as an XSLT template. This was the feature that sold me. XSLT templates allow you to be as expressive as you need to be. You can use all the standard fields, apply transformations, and use all the fields in the order/related tables (your custom ones too). All this with a nice autocomplete interface and a dead-simple way to test the template against an existing order. Once you get the hang of it, you almost never need to refer to the docs/examples, it’s that easy. If you do need help though, https://support.xtento.com/wiki/Magento_2_Extensions:Magento_Order_Export_Module#XSL_Template has you covered.
  • You can define when your export runs. Do you want to export orders on a schedule? How often? Do you export all orders from the last N hours or only what’s new? If the export process is time consuming, there is a CLI command to run it outside the main Magento cron.
  • You have a flexible manual export option in case you need to replay any of the missed exports, or simply test the process.

Everything comes with logging. You have filterable grids where you can see the exported entries and errors. You also have the option to locally store the export files for later review.

If you need to import entities, Xtento has you covered too. The process is very similar: you still have sources to pull from, profiles you can define, the same logging capabilities and a way to map the data. In addition, imports have an actions section where you can define what happens when an entry is imported. For example, when you import a tracking number, you can have Xtento ship and bill the order automatically.

I should mention that Xtento does not currently offer a product import solution. You can import stock, but not product data. I’d love to see that in their offering sometime.

What I really like about the extensions is that they are developer-friendly. Almost everything in their system has a fallback to a custom class. You have a very exotic export destination? You can define a class to implement the communication logic. Need to map your data in a way the XSLT template does not support? You can define a class and a method just for that. Finally, having logs for all operations makes it easy to identify random issues. It scales ok too – I have been exporting and importing 100k records per day with no performance issues.

Here are a few use-cases where I have used Xtento successfully, usually without writing a single line of code:

  • product exports to marketplaces/product aggregators. I still prefer specific extensions for big marketplaces, like Amazon or Google Shopping, but will use Xtento for the ones that have poor extensions or none at all.
  • pushing order data to ERPs and getting back tracking info, billing and shipping automatically. That’s a huge one; before Xtento I used to spend a lot of time on these types of implementations.
  • pushing order and product data to marketing software, like mailing list managers.
  • importing stocks from an external source (usually ERP).

Xtento might lack all the bells and whistles of an ETL solution, but in an ecosystem where not everyone has a fortune to spend on an integration, their extension suite is more than decent for getting things done.

Algolia and Magento 2 – a perfect match

Magento 2 has never been ultra-fast. Even with careful customisation and a catalog that is not very large, it still takes somewhere near 2s to render the average category view page. This slowness is usually mitigated by the full page cache, but navigation and filtering patterns are too varied to hope that, in a browsing session, a user will only hit cached pages.

Now, 1s-2s of server-side generation time is not too bad. But more and more merchants expect an almost-instant browsing experience. Magento is hard at work on the PWA implementation, which should deliver a very smooth experience, but that’s still in progress. And when it’s done, extension vendors will need time to catch up. In my opinion, all the approaches to Magento PWA, either official or community-based, call for a big implementation budget.

And here comes Algolia. Stop now and go here, check the load speed as you filter through the left-hand navigation. Almost instant.

Algolia is a search service. They integrate with Magento via an open source module. Basically, the module replaces all your search and category pages with an empty catalog page that, at load time, does a client-side call to Algolia, gets a JSON-formatted list of products, then uses HTML templates that are under your control to render the results. So the initial page load time includes Magento bootstrapping (which should be under 500ms), plus a few hundred milliseconds for the Algolia call. Even better, the navigation is AJAX-based, so any further filtering is almost instant. Of course, they also have very relevant (and customisable) search results.
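For reference, getting the module itself into a Magento 2 project is the usual composer routine – a sketch assuming the open source module is published as the algolia/algoliasearch-magento-2 package (double check the current package and module names before running this):

composer require algolia/algoliasearch-magento-2
bin/magento module:enable Algolia_AlgoliaSearch
bin/magento setup:upgrade
bin/magento indexer:reindex

After that, the Algolia application ID and API keys are configured in the admin, and the indexers push your catalog to their service.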

That speed is a huge selling point for me. While I still have to deal with performance issues on product detail pages, cart and checkout, using Algolia makes catalog navigation ultra-fast. And it takes a lot of load off the server, since we don’t process the catalog there.

All this sounds great, but there are a few reasons why you may not be able to use it.

  • It is a paid service. Their pricing scheme is good though, and I am almost positive the increase in conversion rate will pay for it. Not to mention the countless hours you’d otherwise spend trying to optimize the catalog pages, if you get to that point.
  • It is a SaaS. You are fully dependent on them. When I was working on a project, my free trial ended and the site instantly became unusable. That got me thinking that an Algolia outage would break my site. But they look well funded and have great uptime.
  • If you have an existing project, it’s way harder to move to Algolia. Since they replace all templates on the catalog pages, any skinning/custom features will be gone. Of course you can re-do them using their templates, but it will mean a lot more than just installing the Algolia module. Don’t expect any community extension that touches the category page to work out of the box either.

I also had to customise Algolia a bit and, luckily, there are options. Roughly, the module works by creating a product indexer that pushes data to Algolia. Via the admin, you get to decide which attributes are searchable, filterable etc. All changes to products are pushed at indexing time. When rendering the results, you have access to the JSON and can render it as you wish using HTML and a proprietary markup language.

Overall, it’s very easy to change CSS styling and add more attributes or custom logic, but you should be aware that at render time you don’t have access to the product collection, so you cannot call Magento for any logic. Or well, I guess you could, but it defeats the purpose of using Algolia. The only approach should be to send any data you need at index time (even precalculating some attributes if needed), then simply display it in the templates. Algolia has you covered for the more common complex features, like different pricing per customer group, but I can see how things could get complicated if you have custom logic for displaying data based on the current customer’s session. Still doable though.

All in all, Algolia is in my top 3 recommendations for a new Magento 2 project. It’s also in my top 3 recommendations for websites that have performance issues. I do hope that Magento will provide a similar experience based on Elasticsearch once PWA is finalised, but until then Algolia is one of the best integrations you can have for your store.