Technology Solutions

Stelo + Microsoft Azure

Stelo helps you connect your data to Microsoft Azure

Stelo is cost-efficient, low-impact, real-time data replication software that delivers data easily and efficiently into Microsoft Azure environments.

Businesses require more flexibility as they transition from maintaining on-premises data centers to operating more complex data ecosystems. You need reliable data ingestion and streaming for applications like predictive analytics, business intelligence, and client reporting. Stelo connects all source and destination types for every use case, so you can easily offload data processing, storage, and analytics from costly production systems.

Start moving operational data from your on-premises database to your Azure repository for use in data warehouse or delta lake applications. Moving data across a wide area network (WAN) inherently adds latency because of limits on how quickly data can travel over the network. Stelo compensates for this latency, providing a seamless experience, and our deployment method is designed to maximize performance over both WAN and LAN connections. Easily replicate millions of transactions per hour and transfer data sets larger than 1 TB. No matter the use case, Stelo deployments are readily customizable with a focus on robust performance.

Related Resources

TECHNICAL DATA SHEET →

QUICK START GUIDE →

THE BEST DATA REPLICATION STRATEGIES →

SCHEDULE A DEMO

Connects From

Customizable

Anywhere-to-Anywhere

Avoid vendor lock-in. Stelo uses heterogeneous replication for bi-directional support across all source and destination types. Our open-standards approach allows us to remain vendor-agnostic while providing highly flexible deployment models.

Quick Setup

Rapid Deployment

Streamline your deployment plan without costly delays. Stelo typically deploys in less than a day and cuts production time down from months to only weeks.

Easy-to-Use

Set It and Forget It

Simple installation with a graphical interface, configuration wizard, and advanced tools makes product setup and operation straightforward, with no programming needed. Once running, Stelo reliably operates in the background without requiring dedicated engineering support to maintain and manage. Schema changes (ALTER, ADD, and DROP) are replicated automatically.

Low Impact

Near-Zero Footprint

Our process imposes ultra-low CPU load (typically less than 1%) to minimize production impact and avoid operational disruption. No software installation is required on the source or destination. Dataset Partitioning lets you transfer only the data you need.

Cost-Efficient

Unlimited Connections

A single instance can support multiple sources and destinations without additional licensing. The Stelo license model is independent of the number of cores on either the source or destination, so you only pay for the capacity required to support your transaction volume. Your data ecosystem can change over time without additional costs.

Low Latency

Multi-Threaded Processing

Stelo leverages native data loading functions and multithreaded processing to deliver fast, reliable performance when replicating multiple tables concurrently, as illustrated in the sketch below.
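As a conceptual illustration only, not Stelo's implementation, the following sketch shows the general idea of loading several tables concurrently with a thread pool; the table names and the copy_table helper are hypothetical placeholders.

# Conceptual sketch: concurrent per-table loading with a thread pool.
# Table names and copy_table() are hypothetical, not Stelo internals.
from concurrent.futures import ThreadPoolExecutor, as_completed

TABLES = ["ORDERS", "CUSTOMERS", "INVENTORY", "SHIPMENTS"]  # hypothetical

def copy_table(table_name: str) -> int:
    """Placeholder for a per-table bulk load; returns rows copied."""
    # A real pipeline would read from the source and use the destination's
    # native bulk-load path here.
    rows_copied = 0
    return rows_copied

# Replicate several tables concurrently rather than one after another.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(copy_table, t): t for t in TABLES}
    for future in as_completed(futures):
        table = futures[future]
        print(f"{table}: {future.result()} rows copied")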

Data Warehousing: It's Time to Evolve Your Data Management Strategy

The “new frontier” of data ingestion goes beyond traditional data warehousing by enabling more choice. New open-source systems allow companies to efficiently leverage a range of heterogeneous tools while maintaining high performance. Cloud providers like Microsoft have built services such as Azure that let anyone producing Kafka data connect and move data from relational sources to cloud and non-relational platforms such as Salesforce, Google Cloud, and Synapse. Many companies are taking advantage of these new technologies to efficiently stream data into cloud-based delta lakes while maintaining their existing data warehouse applications.

Others are offloading their in-house data centers entirely. Most data originates in online transactional processing (OLTP) systems supported by technology companies that are experts in managing high-volume transactions, but legacy systems weren't designed to help users leverage emerging high-data application trends. Reporting on legacy databases can either reduce performance or require expensive hardware upgrades to offset those reductions. To address this challenge, many organizations offload their reporting to a data store, such as Microsoft SQL Server, that scales more cost-effectively. Read our full post to discover how real-time data integration affects data warehousing strategies.

READ THE BLOG

FAQ

Do you support my replication pairing?

Stelo takes full advantage of open standards such as DRDA, SQL, ODBC, and JDBC to maximize compatibility and interoperability within an enterprise network. We are an active member of The Open Group software industry consortium, which was responsible for the adoption of DRDA as an industry standard for database interoperability.

Currently, Stelo supports more than 30 ODBC databases and our Kafka interface can also be used to communicate with cloud-based streaming services such as Azure Event Hubs for Kafka, the Oracle Cloud Infrastructure Streaming service, Amazon Managed Streaming for Apache Kafka, and IBM Event Streams. Stelo can also populate Azure Data Lake Storage Gen2 (ADLSg2) and similar NoSQL data repositories.
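For readers curious about what the Kafka-compatible path looks like in practice, here is a minimal sketch, not a Stelo configuration, of publishing a change event to Azure Event Hubs for Kafka with the open-source confluent-kafka client; the namespace, event hub name, connection string, and event payload are placeholders.

# Minimal sketch (not a Stelo configuration): producing a change event to an
# Azure Event Hubs Kafka endpoint with the open-source confluent-kafka client.
# Namespace, event hub (topic) name, and connection string are placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    # Event Hubs exposes its Kafka endpoint on port 9093 of the namespace host.
    "bootstrap.servers": "my-namespace.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    # Event Hubs authenticates Kafka clients with the literal user name
    # "$ConnectionString" and the namespace connection string as the password.
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://my-namespace.servicebus.windows.net/;...",
})

change_event = {"table": "ORDERS", "op": "UPDATE", "key": 1001}  # illustrative
producer.produce("my-event-hub", value=json.dumps(change_event).encode("utf-8"))
producer.flush()

Because Event Hubs speaks the Kafka wire protocol, the same producer code can typically target a self-managed Kafka cluster simply by swapping the connection settings.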

Stelo continues to use our open standards approach to ensure that we meet emerging replication requirements. We are continually adding support for new technologies while supporting legacy systems. Stelo is designed to grow with your organization rather than lock you in to any specific database platforms.

Why is real-time change data capture important?

Analyzing a company’s ongoing transactional or operational data can provide key insights that drive business decisions and processes. Many companies offload these operations to other databases, data warehouses, or data lakehouses to reduce source system load and save costs. However, replicating dynamic data from multiple source databases to one or more destination applications requires real-time change data capture (CDC) to ensure all updates are automatically and reliably reflected. In other words, business insights are only effective if data changes are accurately captured and delivered across the connected databases over time.

CDC is typically the first stage in the data ingestion process and represents the “extraction” phase of ETL and ELT approaches. CDC strategies often force a tradeoff between latency tolerance and change volume, but Stelo’s approach provides the ideal balance between fast, high-volume performance and lossless data capture. Our process propagates a source system’s data changes and structure updates to destination platforms together, so the data arrives in the correct sequence. In addition, our cost-efficient method merges incoming data flows into a single stream rather than many separate streams, which quickly become expensive.
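As a conceptual sketch of that idea, not Stelo's algorithm, the example below merges already-ordered change streams from two sources into one sequence-ordered stream before applying each event; the event shape and the apply_change helper are hypothetical.

# Conceptual sketch: merge captured change events from several sources into a
# single, sequence-ordered stream before applying them downstream.
# The event shape and apply_change() are hypothetical, not Stelo's API.
import heapq
from typing import Iterable

def merge_change_streams(streams: Iterable[Iterable[dict]]):
    """Yield change events from all streams in commit-sequence order."""
    # heapq.merge performs an ordered merge of already-ordered inputs.
    yield from heapq.merge(*streams, key=lambda ev: ev["sequence"])

def apply_change(event: dict) -> None:
    """Placeholder for applying one insert/update/delete to the destination."""
    print(f"apply {event['op']} to {event['table']} (seq {event['sequence']})")

stream_a = [{"sequence": 1, "table": "ORDERS", "op": "INSERT"},
            {"sequence": 4, "table": "ORDERS", "op": "UPDATE"}]
stream_b = [{"sequence": 2, "table": "CUSTOMERS", "op": "INSERT"},
            {"sequence": 3, "table": "CUSTOMERS", "op": "DELETE"}]

for event in merge_change_streams([stream_a, stream_b]):
    apply_change(event)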

Does your software require a dedicated engineer?

Some data replication solutions require that you pay for a dedicated engineer, which substantially increases costs and suggests that the software may be complicated and difficult to use. Good replication software does not require a babysitter. Once running, Stelo reliably operates in the background without needing dedicated engineering support to maintain and manage.

Can Stelo build my data warehouse?

Yes, Stelo has an automatic method to build out data warehouses, so we can do the heavy lifting for you. It works well whether you're performing a migration to the cloud or continuing with ongoing replication.

We encourage you to configure Stelo to create the destination tables on SQL Server. In part, this ensures proper mapping between the change data captured on the source system and the destination table. Also, if the customer's high availability (HA) software captures *AFTER images only, then Stelo must rely on a synthetic column, the Relative Record Number ("RRN"), to uniquely identify each row in the destination table, and tools such as JDE are unlikely to provide this column. It is therefore best to allow Stelo to create the destination, including this column, for each table. Alternatively, if the HA software configures *BOTH images in the journals, then we can dispense with the RRN column and use any unique index (including a primary key) to identify rows.

Another issue that should be addressed in the planning stage is the use of Unicode data types on the destination SQL Server. We strongly encourage customers to use NCHAR/NVARCHAR data types for SQL Server as this provides the greatest fidelity when mapping data from IBM i EBCDIC to Windows and other non-EBCDIC environments.

Our recommendation is to allow Stelo to create the tables and exploit Unicode data types, and then use the tool to create any additional indexes.
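To make the recommendation concrete, here is a hypothetical illustration (the table, columns, and connection string are invented, and Stelo would normally generate this DDL itself) of a SQL Server destination table that uses NVARCHAR columns and carries an RRN column to uniquely identify rows when only *AFTER images are journaled.

# Hypothetical illustration only: a SQL Server destination table with Unicode
# (NVARCHAR) columns and a synthetic RRN column. Table/column names and the
# connection string are invented; Stelo would normally create this table.
import pyodbc

DDL = """
CREATE TABLE dbo.ORDER_HISTORY (
    RRN         BIGINT        NOT NULL PRIMARY KEY,  -- relative record number from the source
    ORDER_NO    INT           NOT NULL,
    CUSTOMER    NVARCHAR(60)  NOT NULL,              -- NVARCHAR preserves fidelity from EBCDIC sources
    STATUS      NVARCHAR(10)  NULL,
    UPDATED_AT  DATETIME2     NULL
);
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=my-sql-server.example.com;DATABASE=Staging;"
    "UID=replication_user;PWD=...;Encrypt=yes;"  # placeholder credentials
)
with conn:          # commits on successful exit
    conn.execute(DDL)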

Support Features


Accessible Support

Quick support is available for training, troubleshooting, version updates, and data replication architecture. 24/7 Urgent Incident Support is included in annual subscriptions.


Highly Experienced Team

Stelo’s technologists have more than 30 years' experience developing reliable data software. Whether you need basic support or have a tricky technical challenge, we can work with you to solve any problem.


End-to-End Proficiency

Our team has detailed knowledge of every data platform we support and can troubleshoot end-to-end replication pairing in heterogeneous environments to ensure the pairings are working properly.


Constant Evolution

Unlike some other solutions, Stelo won't go out of date. New source and target types are continuously added through active updates to stay compatible with emerging market requirements.

The Latest from Our Blog

Futureproofing Your Data Management Strategy with NoSQL and Data Streaming
Oct 31, 2024 · 2 min read

How Stelo V6.3 Helps You Master Data Integration
Nov 28, 2023 · 2 min read

Sunsetting: What to Do When Your Data Replication Tool is No Longer Supported
Aug 29, 2023 · 3 min read

Unboxing Stelo V6.1: MERGE Support
Apr 25, 2023 · 4 min read

Get Started

These three steps will help you confirm that Stelo works for your needs and then seamlessly deploy your solution.

1

Schedule a Demo

Our expert consultants will guide you through the functionality of Stelo, using your intended data stores.

2

Try Stelo

Test the full capability of the software in your own environment for 15 days. No obligations.

3

Go Live

When you're ready, we can deploy your Stelo instance in under 24 hours with no disruptions to your operations.

SCHEDULE A DEMO