Pacemaker

Technology & Architecture

Integration in Magento

Pacemaker is a Magento module fully implemented in PHP, so it has access to all processes running in Magento and to all available resources. Connections to third-party systems are usually made via standard interfaces such as web services, CSV and XML files. Additional connectors to other systems can therefore be implemented quickly and easily if they are not already available.

Structure & Components

Pacemaker itself has a modular structure to ease maintenance and development. The strict separation of responsibilities additionally ensures that changes do not lead to unwanted side effects or cause difficulties with updates. The numerous modules can be grouped into the following higher-level components:
 

  • Process Pipeline Framework
  • Import Framework
  • Pipeline-Based Imports
  • Order Workflow

Adaptable and Expandable

All components of Pacemaker are implemented in such a way that they can be quickly and easily adapted to your own requirements and extended if necessary. The interaction between the Process Pipeline Framework and the Import Framework is handled by so-called executors, which start the import process from a pipeline step, for example by calling a CLI command. Source-code dependencies between the two frameworks were deliberately avoided.
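
As a minimal sketch of this decoupling, an executor could shell out to a CLI command from a pipeline step. The class name CliImportExecutor and the import:products command below are purely illustrative and not taken from Pacemaker's actual code:

```php
<?php

use Symfony\Component\Process\Process;

/**
 * Hypothetical executor: triggers an import run from a pipeline step by
 * shelling out to a CLI command instead of calling the Import Framework's
 * classes directly, so the two frameworks stay decoupled.
 */
class CliImportExecutor
{
    public function execute(string $command, array $arguments = []): bool
    {
        // Builds e.g. "bin/magento import:products --source=var/importexport"
        $process = new Process(array_merge(['bin/magento', $command], $arguments));
        $process->setTimeout(3600); // imports may run for a long time
        $process->run();

        return $process->isSuccessful();
    }
}

// Usage inside a pipeline step (command and option are illustrative only):
$executor = new CliImportExecutor();
$executor->execute('import:products', ['--source=var/importexport']);
```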

The Process Pipeline Framework was implemented as a pure Magento extension and can be extended with new pipelines and steps using the means provided by the Magento Framework. Many adaptations, however, can already be made by editing the XML files on which the pipelines are based, which considerably reduces the programming skills required. The pipelines delivered out of the box are a very good starting point.

A different approach was taken for the Import Framework. Since it follows a generic, system-independent approach and can therefore also be used with other systems, the Symfony Framework was chosen as the underlying technology. The Import Framework is fully declarative and by now allows a multitude of adjustments purely via configuration.

How was the Pipeline Pattern implemented?

In its implementation, Pacemaker is inspired by tools such as Jenkins, GitLab CI and Travis CI. As in those tools, processes are modeled through configuration, in Pacemaker's case in XML format. Conditions can be used to define when a pipeline should be started. For example, an import pipeline should only be started when import files are available and no indexing is in progress.

To execute a step (task), a message is transferred to the message queue. This message remains in the queue until it has been successfully processed by a runner. This ensures that messages are not lost even if a system fails, and that they can be processed asynchronously and in a decentralized manner.
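
A rough sketch of how such a task message could be published and consumed using Magento's standard message-queue API; the topic name pacemaker.pipeline.task and both class names are invented for illustration and do not reflect Pacemaker's actual topics or classes:

```php
<?php

use Magento\Framework\MessageQueue\PublisherInterface;

/**
 * Hypothetical example: hand a pipeline task over to the message queue.
 * A runner (consumer) picks it up later; if processing fails, the message
 * stays in the queue and can be retried.
 */
class TaskDispatcher
{
    public function __construct(private PublisherInterface $publisher)
    {
    }

    public function dispatch(string $pipeline, string $task): void
    {
        // Topic name is illustrative; it would have to be declared in the
        // module's queue configuration (communication.xml, queue_topology.xml).
        $this->publisher->publish('pacemaker.pipeline.task', json_encode([
            'pipeline' => $pipeline,
            'task'     => $task,
        ]));
    }
}

/**
 * Hypothetical consumer ("runner") registered via queue_consumer.xml.
 * The message is acknowledged only after the task has finished successfully.
 */
class TaskRunner
{
    public function process(string $message): void
    {
        $task = json_decode($message, true);
        // ... execute the task, e.g. via an executor / CLI command ...
    }
}
```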

Pipelines vs. CRON-Jobs

Pacemaker adopts the pipeline pattern, which has proven itself in data processing for a long time. This opens up previously unused possibilities for the Magento ecosystem. Because it is integrated as a real Magento module, developers have direct access to the processes running in Magento and can control and coordinate them. This increases the transparency, performance and scalability of Magento while avoiding problems such as inconsistent data.

 

Lightweight processes

Processes are broken down into any number of stages and mapped as a so-called pipeline. Each stage must contain at least one task, but may contain many. By breaking a monolithic process down into small steps, it becomes easier to manage, more transparent and easier to maintain.

Conditioning & prioritization

Configurable and flexible conditions can be used to control exactly whether and when a pipeline is started. In addition, dependencies can be defined between pipelines and even across stages and their tasks. This makes it possible to prioritize execution precisely and to adhere to a defined sequence.
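
As a rough illustration of such a start condition (the interface and class names are invented for this example and are not Pacemaker's actual API), a condition might simply check whether import files are waiting in the source directory:

```php
<?php

/** Hypothetical condition contract; not Pacemaker's actual interface. */
interface ConditionInterface
{
    public function isSatisfied(): bool;
}

/** Start the import pipeline only if CSV files are waiting in the source directory. */
class ImportFilesAvailableCondition implements ConditionInterface
{
    public function __construct(private string $sourceDir)
    {
    }

    public function isSatisfied(): bool
    {
        $files = glob(rtrim($this->sourceDir, '/') . '/*.csv');

        return !empty($files);
    }
}
```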

Asynchronous & parallel execution

By executing the pipelines with a runner, processes can be decoupled from one another and executed asynchronously. Since this decoupling takes place in a controlled environment, pipelines and tasks can be parallelized. This of course significantly accelerates processing.

Scalability & Decentralization

By communicating through messages in a queue, the executing runners can be installed on any number of servers. Distributing the processes across several servers, for example depending on the available resources, offers unique possibilities for scaling and reliability.

Visualization & Monitoring

By visualizing the pipelines, stages and tasks via the CLI and the Admin UI, all running processes can be monitored with ease. Especially in the event of an error, problems can be quickly detected, debugged and corrected. Individual steps can be repeated as often as required, even on other systems.

What is the architecture of the Import Framework?

Besides providing fast and memory-optimized import processes, the Pacemaker Import Framework aims to make adaptations and extensions possible by declaration alone, without any programming. This declarative approach, combined with generic and reusable components, offers a high potential for saving time and cost. Broadly speaking, the components of the Import Framework each encapsulate the import functionality for a specific entity, and they can be combined with each other in almost any way. For example, the product import also uses the attribute and category import components to create attributes and categories dynamically if they do not already exist.

Advantages and functionalities of the Pacemaker Import Framework

The Pacemaker Import Framework comes with a number of components that help you create powerful, rock-solid and easy-to-use import services that are nearly 100% compatible with the standard CSV format of Magento 2.

 

Declarative approach

Adjustments can be made in a declarative manner to avoid unnecessary development effort and save money.

Workflow Engine

Configurable and powerful workflow engine that allows fine-grained adaptation of the import process to the requirements of almost any application.

Archiving/Artifacts

Automatic archiving of processed artifacts after the import process is complete.

Bunch Support

Import entities from multiple source files in one step with a single transaction and rollback in case of errors.

Consistent data

Configurable cleanup functionality for relations, media gallery and empty fields during import additions or updates.

Multiple operations

Delete, add, update & replace import operations on supported entity types.

File Handling

Process multiple files by creating a smart ".OK-file" and moving import files to a temporary directory during processing.
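
As a simplified illustration of such an ".OK-file" convention (the exact semantics in Pacemaker may differ), processing could start only once a matching .ok file exists that lists the CSV files belonging to the run:

```php
<?php

/**
 * Simplified illustration of an ".OK-file" convention: processing starts
 * only when an .ok file is present, and that file lists the CSV files that
 * belong to the current import run. Pacemaker's exact semantics may differ.
 */
function findReadyImportFiles(string $sourceDir): array
{
    $csvFiles = [];
    foreach (glob($sourceDir . '/*.ok') ?: [] as $okFile) {
        foreach (file($okFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $name) {
            $csvFiles[] = $sourceDir . '/' . $name;
        }
    }

    return $csvFiles; // an empty array means the upload is not complete yet
}
```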

Multithreading 

Designed to support multithreaded and multiprocess environments for massive performance improvements in a variety of scenarios.

CLI

Command Line Interface to invoke import commands and maintenance tasks.

Delta Import

Links, grouped products, variants and bundles can refer to data that is not part of the CSV file to reduce file size and improve performance.

Common Entity

Support for products, categories, customers, customer addresses and attributes + attribute sets/groups.

Product types

Supports simple, variant, bundle, and grouped product types and provides functionality for importing inventory, pricing, relationship, and media types.

Dynamic option values

Missing option values can be created on-the-fly, if necessary and desired.

Dynamic Categories

Missing categories can be created on-the-fly, if necessary and desired.

Dynamic image types

Configurable and dynamic image types allow an unlimited number of additional image types to be configured and processed.

Admin UI

Admin UI for invoking and monitoring import commands.

Caching

Advanced and configurable fine-grained caching and cache warming functionality to improve performance and reduce database penetration.

Batch Processing

Batch processing and the use of multi-value SQL statements to improve performance and reduce database penetration.
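
As a generic illustration of a multi-value statement, many rows are bundled into one INSERT instead of issuing a statement per row; the table and column names below are invented for the example and are not Magento's actual schema:

```php
<?php

/**
 * Illustrative only: build a single multi-value INSERT for a whole batch of
 * rows instead of executing one INSERT per row. Table/column names are examples.
 */
function insertBatch(\PDO $pdo, array $rows): void
{
    if (empty($rows)) {
        return;
    }

    // One "(?, ?)" group per row, e.g. "(?, ?), (?, ?), (?, ?)"
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?)'));
    $stmt = $pdo->prepare(
        "INSERT INTO example_stock_table (sku, qty) VALUES $placeholders"
    );

    $params = [];
    foreach ($rows as $row) {
        $params[] = $row['sku'];
        $params[] = $row['qty'];
    }

    $stmt->execute($params);
}
```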

Time Stamp Detection

Use the entity timestamp to determine whether the entity has changed since the last import and skip the row if not, reducing database penetration.

Change-Set Detection

Detect differences between CSV and database to process only changes, reducing database penetration.

Single Transaction

The option to run imports as part of a transaction ensures that the data in the store is consistent at all times.
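
As a generic illustration of such a transactional import (plain PDO here, not Pacemaker's internal implementation):

```php
<?php

/** Generic sketch: wrap an import in one transaction and roll back on error. */
function importWithinTransaction(\PDO $pdo, callable $import): void
{
    $pdo->beginTransaction();
    try {
        $import($pdo);    // run all import statements
        $pdo->commit();   // make everything visible at once
    } catch (\Throwable $e) {
        $pdo->rollBack(); // store data stays consistent on failure
        throw $e;
    }
}
```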

Security and flexibility through pipeline based imports

A significant added value is the process know-how acquired in the course of many projects. Packaged into ready-to-use pipelines, it allows Pacemaker customers to start immediately and to be certain from the very first moment that their processes will run reliably and stably. If required, new functionality can easily be integrated into an existing pipeline as an additional step via an XML file.

Catalog Import

One of the most common applications for Pacemaker is the integration of third-party systems. These are mainly PIM (Product Information Management) or ERP systems, where data is either imported into Magento or transferred from Magento to the connected system. Especially for the import of the product catalog, which includes products, attribute sets, attributes and of course categories, Pacemaker comes with a pipeline that updates the catalog as soon as corresponding data is placed in a configurable directory. The pipeline ensures that

  • the processed files are moved into a temporary directory
  • the indexing of the catalog is stopped
  • the files are imported in the correct order
  • the file for refreshing the image cache is exported (an additional module is required for the actual update)
  • and finally the indexing of the catalog is started again.

After this process the catalog is up to date again. Throughout the entire process, the pipeline ensures that the shop remains online at all times and that customers can continue shopping without any limitations.
 

Stock Import

In addition to the catalog, current stock levels are certainly among the most crucial pieces of information in day-to-day operations. To ensure that stock levels are updated within the shortest possible time, Pacemaker is delivered with an extremely reduced pipeline that carries out a delta update of the stock levels at short intervals. Since the processed file only contains SKU and stock quantity, the runtime, and thus the impact on the system, is negligible.

Import of prices

Analogous to the stock levels, Pacemaker includes a standard pipeline for the import of prices. Like stock levels, prices are operational data that is often subject to a high update frequency. To ensure that the prices displayed in the shop are up to date, the price import likewise contains only the most necessary data, which keeps the runtime to a minimum.

Import of images & videos

Pacemaker also comes with a standard pipeline for the import of images and videos. Importing images and videos is often a complex requirement, especially because of the large amount of data involved. Images can either be imported together with the products, or a dedicated pipeline can be used for this purpose. Both options provide a proven workflow that can be quickly and easily extended with intelligent functions such as delta cache management (via the additional, chargeable Image Cache module).

Transmission of incoming orders to your existing ERP

This component contains the central logic for transferring orders via flexible pipelines to the responsible Enterprise Resource Planning (ERP) system, as well as for synchronizing the order status between the systems.

Order Export

This module uses configurable rules to determine whether an order should be exported. The conditions for the export can be linked to the payment method, for example, or depend on the status of the order. If such an order is detected, a corresponding export pipeline is initialized. This behavior can be configured individually for each website (client). The export pipeline is divided into up to four steps:

  • Transformation
  • Transport 
  • Response
  • Notification. 

Depending on the target system, these steps can be deactivated or merged.

In the Transformation step, the Magento order is converted into a target format. Any file format can easily be defined with a template in the Magento backend, and specific formats can be added programmatically through appropriate interfaces, for example by means of the SAP plugin for Pacemaker that we developed.

In the Transport step, the previously transformed order is transferred to the target system. In the standard delivery, the order can be stored in the local file system; using appropriate extensions, (s)FTP, REST, SOAP, etc. can be added as additional handlers. Depending on the type of transfer, the response from the target system can be processed directly within this step or in the following step.

The Response step processes asynchronous responses from the target system. In the standard delivery, a predefined file is imported that is expected to contain the request ID (of the target system). As with the steps described above, there are numerous extension points to adapt the response processing to the respective target system.

The Notification step enables you to send a notification as soon as the order has reached a corresponding status. This means, for example, that the order confirmation can only be sent once the order has also been accepted by the target system.
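
To illustrate how these steps could be chained, here is a sketch with a hypothetical step interface and two of the four steps; the interface, class names and file paths are invented for the example and are not Pacemaker's actual code:

```php
<?php

/** Hypothetical step contract; not Pacemaker's actual interface. */
interface ExportStepInterface
{
    /** Receives the order data and returns it, possibly enriched. */
    public function process(array $order): array;
}

/** Converts the Magento order into the target format (here simply JSON). */
class TransformationStep implements ExportStepInterface
{
    public function process(array $order): array
    {
        $order['payload'] = json_encode([
            'id'    => $order['increment_id'],
            'items' => $order['items'],
        ]);

        return $order;
    }
}

/** Hands the transformed order over to the target system (here the local file system). */
class TransportStep implements ExportStepInterface
{
    public function process(array $order): array
    {
        // (s)FTP, REST or SOAP handlers could be plugged in instead;
        // the target directory is assumed to exist.
        file_put_contents('var/export/order_' . $order['increment_id'] . '.json', $order['payload']);

        return $order;
    }
}

// Steps can be deactivated or merged depending on the target system;
// Response and Notification steps would follow the same contract.
$steps = [new TransformationStep(), new TransportStep()];
$order = ['increment_id' => '100000001', 'items' => [['sku' => 'A', 'qty' => 1]]];
foreach ($steps as $step) {
    $order = $step->process($order);
}
```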
 

Line Items

Line Items is an extension that becomes active when the order is created in Magento. Additional attributes are used to number the individual items of the order according to pre-configured rules. This requirement results from the functionality of many ERP systems. With this module, the numbering is done explicitly when the order is created. This ensures a higher stability of the entire system (meaning the entire e-commerce infrastructure, not only the Magento instance) than with implicit numbering during the export process or the processing of the response.

With this module, numbering schemes such as 0010, 0020, 0030, etc. can be implemented through configuration in the backend alone. Configurations can be designed individually for each client (website) and therefore also for each target system.
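
A small sketch of how such a numbering rule could work, assuming a configurable start value, increment and zero-padding; the function and field names are illustrative and not the module's actual configuration keys:

```php
<?php

/**
 * Illustrative only: number order line items like 0010, 0020, 0030, ...
 * using a configurable start value, increment and padding width.
 */
function numberLineItems(array $items, int $start = 10, int $step = 10, int $width = 4): array
{
    $position = $start;
    foreach ($items as $i => $item) {
        $items[$i]['line_item_no'] = str_pad((string) $position, $width, '0', STR_PAD_LEFT);
        $position += $step;
    }

    return $items;
}

// Example: three items receive the numbers 0010, 0020 and 0030
$items = numberLineItems([['sku' => 'A'], ['sku' => 'B'], ['sku' => 'C']]);
```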
