TL;DR – EcholoN Data Workflow System (ETL tool):
- Central ETL platform for extracting, transforming, and loading data from various sources (SAP, Oracle, cloud, etc.).
- No-code, graphical configuration, no programming required.
- Automated and flexible: Time- or event-driven data processes.
- Data quality & BI optimization through validated, consolidated data.
- Scalable even for large data volumes & cloud integration.
- Practical examples: Bünting, CLSAG, Hornbach – improved transparency, efficiency, and reporting.
- Benefits: Automation, clean data, less manual effort, faster decisions.
Features
- Data extraction: Extract raw data from various data sources such as databases, cloud services, and big data systems.
- Data quality: Ensure that the transformed data has the required data quality to enable sound analysis.
- Data management: Structure large amounts of data efficiently with our powerful ETL system.
- Data integration: Seamlessly connect different systems such as SAP, Oracle, and Salesforce.
- Flexibility: Support both ETL and ELT processes to meet different requirements.
- Automation: Automate recurring ETL processes to save time and resources.
- Data profiling: Analyze your data before transformation to improve data quality.
- Data from different sources: Take advantage of data integration to effectively extract, transform, and load data from different sources such as databases, the cloud, and data lakes into a target system. This flexibility allows you to combine and aggregate data from multiple sources, resulting in comprehensive analysis and a better overview of your data.
Advantages
With the EcholoN Data Workflow System, you benefit from a user-friendly interface, a wide range of connectors, and the ability to integrate and transform data from multiple sources. Our ETL tool is designed to optimize data integration and give you the flexibility you need.
- By using ETL tools, you can extract data from different sources and load it into a target system.
- Increase the efficiency of your data processing.
- Support for cloud solutions.
- Ability to process large amounts of data.
- Leading in the field of data management.
- Asynchronous extraction enables efficient analysis of transformed data.
- Storage in data warehouses or data lakes.
- Optimization of business intelligence through ETL and ELT.
- Improvement of data quality.
Technology
The system supports modern technologies such as Apache Hadoop, Azure Data Factory, and AWS Glue to ensure that your data can be processed efficiently, regardless of the volume.
- Integration of technologies enables real-time analysis of large amounts of data and the extraction of valuable insights.
- Apache Hadoop provides scalable architecture for storing and processing data across distributed systems.
- Azure Data Factory enables the creation of data pipelines to connect different data sources and seamless ETL processes.
- AWS Glue complements serverless data management for cataloging and preparing data for analysis without infrastructure management.
- Combining these technologies integrates data in different formats from different sources for a more comprehensive view of business data.
- Support for advanced analytics methods such as machine learning and artificial intelligence for predictive analytics and early trend detection.
Effective data management through ETL processes and data maintenance in one place
Converting data is rarely a simple task. When transferring data from one system to another, it must be extracted, transformed, filtered, sorted, split, and/or reassembled. ETL tools can play a crucial role here, as they automate the ETL process and improve data quality.
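The extract–transform–load steps described above can be sketched in a few lines. This is a minimal, illustrative example (not EcholoN's implementation, which is configured graphically): raw CSV data is extracted, cleaned and filtered, and loaded into a target table.

```python
import csv
import io
import sqlite3

# Illustrative source data; a real extraction would read from a file, API, or database.
RAW_CSV = """id,name,cost_center
1, Laptop A ,CC-100
2,Monitor B,cc-200
3,,CC-300
"""

def extract(text):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, normalize cost centers, filter out empty names."""
    cleaned = []
    for row in rows:
        name = row["name"].strip()
        if not name:  # filter incomplete records
            continue
        cleaned.append({
            "id": int(row["id"]),
            "name": name,
            "cost_center": row["cost_center"].strip().upper(),
        })
    return cleaned

def load(rows, conn):
    """Load: write the validated rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS assets "
        "(id INTEGER PRIMARY KEY, name TEXT, cost_center TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO assets VALUES (:id, :name, :cost_center)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT id, name, cost_center FROM assets").fetchall())
# → [(1, 'Laptop A', 'CC-100'), (2, 'Monitor B', 'CC-200')]
```

Record 3 is dropped in the transform step because its name is empty, which is exactly the kind of validation that improves data quality before loading.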
The challenge is to integrate data from central and decentralized systems easily and efficiently in order to make it available in a data warehouse or data lake. Tools such as SAP Data Services and AWS Glue require careful selection to ensure that the data obtained is structured in the target database and ready for analysis by business intelligence.
Data integration with ETL tool
The EcholoN Data Workflow System (DWS) is a universal system for data integration (DI) and system coupling. Information and data can be migrated and exchanged from one system to another, as well as imported and exported. Thanks to powerful ETL tools, you can efficiently extract, transform, and load data.
Data can be read from almost any data source and made available via the DWS. Whether once, periodically, daily, or weekly—you define at what time or for what event information and data from different sources should be exchanged. This flexibility enables connection to databases, cloud systems, and the aggregation of large amounts of data for effective analysis and business intelligence.
The most commonly used data sources include:
- Files such as Excel, XML, and text files in CSV/SDF format
- SQL as a generic connection to MS SQL, Oracle, MySQL, etc., or via ODBC
- Directory services such as LDAP, Active Directory Service (ADS), Novell eDirectory (NDS), Microsoft Entra ID, etc.
- SNMP for scanning the network infrastructure
- Tool-specific connectors, e.g. PRTG, Nagios, Zabbix, HP SIM, WhatsUp, etc.
- Standard web services via SOAP and WSDL or REST/JSON
- System-specific interfaces, e.g. SAP via BAPI and web services
- Industrial and manufacturing systems (M2M) via MQTT
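The point of supporting so many source types is that every source ends up delivering records in one uniform shape. A hypothetical sketch (the connector names and file are illustrative, not EcholoN's API) shows the idea for two of the sources above, a CSV file and a SQL database:

```python
import csv
import sqlite3
from typing import Dict, Iterable

# Hypothetical connectors: each source type yields uniform dict records,
# mirroring how an ETL tool abstracts files, SQL databases, etc.

def csv_source(path: str) -> Iterable[Dict[str, str]]:
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def sql_source(conn: sqlite3.Connection, query: str) -> Iterable[Dict[str, str]]:
    cur = conn.execute(query)
    cols = [c[0] for c in cur.description]
    for row in cur:
        yield dict(zip(cols, row))

# Demo: merge records from a CSV file and a SQL table into one stream.
with open("users.csv", "w", newline="") as f:
    f.write("login,source\nalice,csv\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (login TEXT, source TEXT)")
conn.execute("INSERT INTO users VALUES ('bob', 'sql')")

records = list(csv_source("users.csv")) + list(sql_source(conn, "SELECT * FROM users"))
print(records)
# → [{'login': 'alice', 'source': 'csv'}, {'login': 'bob', 'source': 'sql'}]
```

Once all sources speak the same record format, the downstream transform and load steps do not need to know where the data came from.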
Data Workflow System in practical use
The graphical user interface makes connecting the individual fields intuitive: the elements of the data source appear on the left, those of the destination on the right. Click on a name and drag it to the right, and a connection linking the two elements appears.
The first mapping uses the search to determine which data records should be taken into account, and based on which criteria. The second mapping defines which elements are passed on during import and synchronization. Optionally, a third mapping determines what should happen if an already synchronized object no longer exists in the source (deactivated/deleted).
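The logic of the three mapping stages can be sketched as follows. This is an illustrative model, not EcholoN's actual configuration format; the directory attribute names (`sAMAccountName`, `mail`) are assumptions for the example:

```python
# Illustrative model of the three mapping stages: selection criteria,
# field mapping, and handling of objects that disappeared from the source.

source = [
    {"sAMAccountName": "jdoe", "mail": "jdoe@example.com", "enabled": True},
    {"sAMAccountName": "svc_backup", "mail": "", "enabled": True},
]
target = {"old_user": {"login": "old_user", "email": "x@example.com"}}

# 1st mapping: which records are taken into account (search criteria)
selected = [r for r in source if r["enabled"] and r["mail"]]

# 2nd mapping: which source fields are passed on to which target fields
FIELD_MAP = {"sAMAccountName": "login", "mail": "email"}
synced = {}
for rec in selected:
    obj = {dst: rec[src] for src, dst in FIELD_MAP.items()}
    synced[obj["login"]] = obj

# 3rd mapping: objects no longer present in the source are deactivated
for login, obj in target.items():
    if login not in synced:
        obj["deactivated"] = True
        synced[login] = obj

print(sorted(synced))
# → ['jdoe', 'old_user']
```

The service account is excluded by the first mapping (no mail address), and the stale target object is flagged rather than silently kept, matching the optional third mapping described above.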
The EcholoN DWS runs as a service and processes the defined tasks immediately or at the specified time. In addition, individual task steps or entire workflows can be triggered manually via the console, which not only displays the current activity but can also generate a log file.
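The idea of a service that runs defined jobs at their scheduled time and writes a log file can be sketched like this. The job name, schedule structure, and single-pass loop are illustrative assumptions, not EcholoN's implementation:

```python
import datetime
import logging

# Minimal sketch of time-driven task execution with a log file.
logging.basicConfig(filename="dws.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def import_hardware_csv():
    # Hypothetical job, e.g. importing hardware deliveries from a CSV drop folder.
    logging.info("importing hardware deliveries from CSV")
    return "ok"

now = datetime.datetime.now()

# Schedule: each entry names an hour and the job to run in that hour.
SCHEDULE = [{"hour": now.hour, "job": import_hardware_csv}]

def run_due_jobs(current_time):
    """One pass of the service loop: run every job whose hour matches."""
    results = []
    for entry in SCHEDULE:
        if entry["hour"] == current_time.hour:
            results.append(entry["job"]())
    return results

print(run_due_jobs(now))
# → ['ok']
```

A real service would loop continuously (or react to events), but the shape is the same: compare the schedule against the clock, execute due jobs, and log each run.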
Connecting a monitoring system, for example Nagios
The goal of such a connection is for alarms and messages to automatically generate a process in EcholoN that handles them. The following description shows two common approaches: via e-mail and at the database level.
The e-mail connection takes place in three steps. First, Nagios is configured to send its messages to a specific mailbox (queue). Second, the queue is linked to EcholoN via the DWS, which processes the data from the queue (cyclic polling of the queue, parsing of the messages, etc.). In the last step, the feedback is sent back to Nagios.
The connection at the database level is more convenient. The prerequisite is that Nagios writes its messages to a central database (for example Oracle, MS SQL Server, MySQL) using a corresponding module (for example PRTG, NagiosEventDB). The DWS connects directly to this database; both the handover to an EcholoN process and the sending of acknowledgments take place here. Data is extracted from the various sources and further processed in the ETL process according to the use case.
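The database-level coupling boils down to a polling cycle: fetch unacknowledged alarm rows, open a process for each, and write the acknowledgment back. A hedged sketch (the table and column names are assumptions for the example, not the actual Nagios or EcholoN schema):

```python
import sqlite3

# Sketch of a database-level coupling: the monitoring system writes alarms
# to a table; a polling job turns new rows into tickets and acknowledges them.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE nagios_events "
    "(id INTEGER PRIMARY KEY, host TEXT, message TEXT, acked INTEGER DEFAULT 0)"
)
conn.execute(
    "INSERT INTO nagios_events (host, message) VALUES ('web01', 'CRITICAL: disk full')"
)

tickets = []  # stand-in for processes created in the service management system

def poll_once(conn):
    """One polling cycle: fetch unacknowledged alarms, open tickets, ack them."""
    rows = conn.execute(
        "SELECT id, host, message FROM nagios_events WHERE acked = 0"
    ).fetchall()
    for event_id, host, message in rows:
        tickets.append({"source": "nagios", "host": host, "text": message})
        conn.execute("UPDATE nagios_events SET acked = 1 WHERE id = ?", (event_id,))
    conn.commit()
    return len(rows)

print(poll_once(conn))  # → 1 (one new alarm processed)
print(poll_once(conn))  # → 0 (nothing new on the next cycle)
```

Because the acknowledgment is written back in the same cycle, the second poll finds nothing new, so no alarm is processed twice.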
Customer references & use cases ETL tool EcholoN Data Workflow System
Bünting – Efficient IT processes and transparency in the hardware lifecycle
The Bünting retail group, with 14,000 employees, was faced with the challenge of outdated helpdesk systems, non-transparent processes, and a lack of overview of hardware and inventory movements. EcholoN was introduced as a central, scalable enterprise service management platform. The Data Workflow System (DWS) took over the migration and ongoing import of large amounts of data, for example from Active Directory or CSV imports for hardware deliveries. Automated workflows now control hardware replacement, repairs, and returns – including automatic delivery note creation and cost center posting. Branches are actively involved via self-service portals and knowledge databases. The result: significantly improved transparency, fewer manual errors, faster processing of hardware cases, and a solid reporting basis for control and planning.
Cash Logistik Security AG (CLSAG) – Quality improvement through centralized service management
CLSAG, a cash logistics service provider, suffered from a lack of transparency and inconsistent processes due to scattered tools and manual lists. EcholoN bundled all service processes into a centralized, process-oriented platform. Requests are now recorded centrally via a single point of contact and distributed in a structured manner to service groups such as complaints or technical support. Interfaces connect EcholoN with external systems, e.g., for the exchange of asset data with partners and manufacturers. Standardized reports, templates, and guidelines increase efficiency. The result: measurable quality improvements, transparent processes, better resource management, and greater ability to provide information to customers—with potential for further automation and expansion.
Hornbach—Modernization of IT support
The DIY store group Hornbach replaced an outdated helpdesk system that offered neither interfaces nor modern functions. With EcholoN, a flexible service management platform was introduced, including a data workflow system for cleaning up and migrating incorrect legacy data such as cost centers and installation locations. A self-service portal enables employees to independently record and track requests. Central mailboxes and mail connectors automatically classify and assign requests. Individual workflows and interfaces create an integrated, future-proof solution. The result: greater transparency, more efficient processes, better user acceptance, and a basis for future expansions, for example in logistics processes.