7/12/2023
Download steep target

The Mule event source triggers the Mule flow. Common event sources are listeners, such as an HTTP listener from Anypoint Connector for HTTP (HTTP Connector), a Scheduler component, or a connector operation that polls for new files.

Processors located upstream of the Batch Job component typically retrieve and, if necessary, prepare a message for the Batch Job component to consume. For example, an HTTP request operation might retrieve the data to process, and a DataWeave script in a Transform Message component might transform the data into a valid format for the Batch Job component to receive.

When the Batch Job component receives a message from an upstream processor in the flow, the Load and Dispatch phase begins. In this phase, the component prepares the input for processing as records, which includes creating a batch job instance in which processing takes place. The batch job instance executes when it reaches the Process phase. All record processing takes place during this phase.

Each Batch Step component contains one or more processors that act upon a record to transform, route, enrich, or modify its data. For example, you might configure a connector operation to pass processed records one by one to an external server.

A Batch Aggregator component is optional, and you can add only one to a Batch Step component. The initial processor within a Batch Aggregator must be able to accept an array of records as input. Batch aggregation is useful for loading an array of processed records to an external server. It is also possible to use components, such as For Each, that iterate over the array so that other processors can process the records individually. The Batch Aggregator component requires a streaming or size setting to indicate how to process records. Additional Batch Step components are optional.
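The pieces described above fit together in a Mule 4 flow configuration. The following is a minimal sketch, not a complete project: the flow name, job name, listener path, and config-ref are illustrative assumptions, and it presumes the HTTP Connector and the Batch module have been added to the application.

```xml
<!-- Illustrative sketch: an event source triggers the flow, the Batch Job
     splits the payload into records (Load and Dispatch), a Batch Step
     processes each record, and an optional Batch Aggregator collects
     processed records into an array. Names below are assumptions. -->
<flow name="batch-demo-flow">
  <!-- Event source: an HTTP listener triggers the Mule flow -->
  <http:listener config-ref="HTTP_Listener_config" path="/start"/>

  <batch:job jobName="demoBatchJob">
    <batch:process-records>
      <!-- Batch Step: processors here act on one record at a time -->
      <batch:step name="transformStep">
        <ee:transform>
          <ee:message>
            <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
          </ee:message>
        </ee:transform>
        <!-- Optional Batch Aggregator: the size setting indicates how many
             processed records to collect before the array is passed on;
             streaming="true" could be used instead of a fixed size -->
        <batch:aggregator size="10">
          <logger level="INFO" message="aggregated a block of records"/>
        </batch:aggregator>
      </batch:step>
    </batch:process-records>
  </batch:job>
</flow>
```

With a fixed `size`, the aggregator's first processor receives an array of up to that many processed records, which is the shape an external-server upload operation (or a For Each over the array) would consume.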