Data Streaming With Kafka

Data Streaming with Kafka
November 11, 2020

Three initiatives to turn your Siebel into the best non-SaaS CRM (positioned along business/technical and tactical/strategic axes):

- KAFKA: Adopt Real-Time Data Streaming with Kafka
- CLOUD NATIVE: Modernize your Siebel Platform with Cloud Native Architecture
- SIEBEL UX: Improve and Transform your Siebel UX with Nexus Face and Nexus Apps

KAFKA: Adopt Real-Time Data Streaming with Kafka

Siebel CRM Cloud Native Sessions: https://bit.ly/SiebelCloudNative

Kafka comes with the Siebel Cloud Native Architecture

Siebel Kafka integration, background and demo: http://bit.ly/SiebelKafka (35'20'' - 45'50'')

01 TECHNICAL CHARACTERISTICS
- Delivers messages with very low latency
- Scales to a thousand brokers and trillions of messages per day
- Stores streams of data safely in a fault-tolerant cluster
- Processes streams of events in real time

02 SIEBEL RELATED NOTES
- Expected in 2021; update to the latest version is required
- Publish and subscribe to events declaratively and using scripting
- Siebel Cloud Native Architecture uses Kafka internally
- Now Siebel can work with Kafka using CDC or custom connectors

03 USE CASES
- Audit of the customers' data access events
- Speed up queries: replace read calls with data syncs
- Google-like search: push Siebel data into Elasticsearch
- Identify and act on risks and opportunities in Siebel data updates using the Streams API
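The publish/subscribe semantics behind these characteristics can be sketched with a minimal in-memory stand-in for a Kafka topic: an append-only log from which each consumer group reads at its own committed offset. This is a pure-Python illustration, not the Kafka client API; all names and records are hypothetical.

```python
from collections import defaultdict

class Topic:
    """Minimal in-memory stand-in for a Kafka topic: an append-only
    log that each consumer group reads from its own offset."""
    def __init__(self):
        self.log = []                    # append-only record log
        self.offsets = defaultdict(int)  # consumer group -> next offset

    def produce(self, record):
        self.log.append(record)

    def consume(self, group, max_records=10):
        start = self.offsets[group]
        records = self.log[start:start + max_records]
        self.offsets[group] += len(records)  # commit the new offset
        return records

topic = Topic()
topic.produce({"event": "AccountUpdated", "id": "1-ABC"})
topic.produce({"event": "AssetCreated", "id": "2-DEF"})

# Each group sees the full stream independently of the others.
audit = topic.consume("audit-service")
search = topic.consume("search-indexer")
print(len(audit), len(search))  # 2 2
```

Because the log is retained rather than consumed destructively, new subscribers (an audit service, a search indexer) can be attached later without changing the producer.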

Data streaming with Kafka: real-time performance and reliability
[Diagram: producers/source systems (mobile apps, microservices, SaaS apps, enterprise apps, clickstreams, sensor data, social media) publish into Kafka; stream processors (Kafka Streams API, ksqlDB, Spark) process the events; consumers/target systems include data lakes / DWH, search, audit, and the same classes of applications.]

Common Kafka usage scenarios for Siebel applications:
- Audit: record user actions and customers' data access events
- Query performance: replace read calls with data syncs
- Google-like search: push Siebel data into Elasticsearch
- Streams API: identify and act on risks & opportunities in Siebel data updates

Speeding up queries with CQRS
[Diagram: consumer queries go to a query handler, which serves them from dedicated read storage instead of the write-side database.]
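The CQRS split can be sketched as follows: commands write to the system of record and emit events, a consumer projects those events into a denormalized read store, and the query handler reads only from that store. This is a hypothetical pure-Python sketch; the `events` list stands in for a Kafka topic and `read_store` for a read database such as CosmosDB.

```python
events = []      # stands in for a Kafka topic of change events
read_store = {}  # stands in for the dedicated read storage

def handle_command(account_id, name):
    # Write side: persist to the system of record (omitted here),
    # then emit a change event for downstream consumers.
    events.append({"type": "AccountUpserted", "id": account_id, "name": name})

def project():
    # Consumer side: apply each event to the denormalized read model.
    for e in events:
        read_store[e["id"]] = {"name": e["name"]}
    events.clear()

def handle_query(account_id):
    # Query handler: serve reads from the read store, never the write DB.
    return read_store.get(account_id)

handle_command("1-ABC", "Acme Corp")
project()
print(handle_query("1-ABC"))  # {'name': 'Acme Corp'}
```

The read model is eventually consistent: a query issued between `handle_command` and `project` would not yet see the new value, which is the trade-off CQRS makes for read performance.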

Speeding up queries in the Siebel CRM ecosystem
[Diagram, current state: a mobile app calls an API Gateway in Azure, which routes to the on-premises Siebel REST API backed by the Siebel DB.]

Speeding up queries in the Siebel CRM ecosystem
[Diagram, target state: commands from the mobile app go through the API Gateway to the on-premises Siebel REST API and Siebel DB; Kafka streams the data into CosmosDB in Azure, from which a separate REST API serves queries.]

Read performance gain (record count in DB / BC / CosmosDB: Accounts 600k / 550k / 500k; Assets 85m / 2.5m / 1m):
- On-prem: flat by System ID, 1.2-1.7x; hierarchy by System ID, 2.5-3.8x
- Azure: large response size, 5.5-13.5x

When to apply:
- Pushing Siebel data into «digital» applications
- Replacing Siebel VBC by data streaming from source systems

Current Kafka integration options for Siebel:
- Source connector: CDC (Debezium, GoldenGate, Striim, etc.)
- Sync connector: REST API based

Implementing Google-like search in the Siebel UI
[Diagram: Siebel data is replicated from the Siebel DB through Kafka into Elasticsearch; a Nexus Face built UI, running the Siebel PM via Nexus Bridge, issues search calls through a REST API, with a Data Access Control layer enforcing the visibility rules from the Siebel Config.]
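The indexing half of this pipeline can be sketched with an in-memory inverted index standing in for Elasticsearch, so the example runs without any external service. The consumer side indexes each replicated Siebel record by the tokens in its name; the search side looks tokens up. All record values and function names are illustrative.

```python
from collections import defaultdict

index = defaultdict(set)  # token -> set of record ids (inverted index)
docs = {}                 # record id -> full record

def index_record(record):
    """Consumer side: replicate a Siebel record into the search index."""
    docs[record["id"]] = record
    for token in record["name"].lower().split():
        index[token].add(record["id"])

def search(query):
    """Search API side: return records matching a single-word query."""
    return [docs[i] for i in sorted(index.get(query.lower(), set()))]

index_record({"id": "1-ABC", "name": "Acme Industrial Supplies"})
index_record({"id": "2-DEF", "name": "Global Acme Logistics"})
print([r["id"] for r in search("acme")])  # ['1-ABC', '2-DEF']
```

In the real pipeline, `index_record` would be a Kafka consumer writing to Elasticsearch, and the data-access-control layer from the diagram would filter `search` results by the caller's Siebel visibility before returning them.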

Kafka is emerging as a middleware platform

1. SUPPORT OF VARIOUS PATTERNS: While Kafka's core integration pattern is event-based, it also supports Fire-and-Forget, Publish/Subscribe, Request-Response/RPC, Batch, and other patterns.
2. SINGLE SET OF TOOLS: Kafka provides all required middleware components, e.g., messaging, storage, connectors, processing. How many products do you currently run in your middleware stack?
3. RELIABLE & SCALABLE INFRASTRUCTURE: Kafka offers extreme scale and throughput while being highly available. With the decoupling of clients, it solves the problems of backpressure.

Organizational implications of Kafka's "dumb pipes, smart endpoints" approach
[Diagram comparing the two topologies:]
- ESB: the event producer / source system and its source connector (if needed) belong to the 1st system dev team; the topics belong to a dedicated ESB dev team; the sink connector (if needed) and the event consumer / target system belong to the 2nd system dev team.
- KAFKA: the same topology, but the topics are run by a general operations team instead of a dedicated ESB dev team.

Stream processing

Present Kafka topics as:
- Streams
- Tables

Two Kafka technologies:
- ksqlDB (SQL-like)
- Streams API (Java, Scala)

Capabilities:
- Transformation of messages
- Filtering
- Aggregation
- Joining topics
- Time windowing

Main use-cases: real-time action on
- Detected anomalies
- Predictions / analytics
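Of the capabilities above, time windowing is the least obvious, so here is a minimal sketch of a tumbling-window count: events are bucketed by fixed, non-overlapping time windows, as the Kafka Streams API or ksqlDB would do over a topic of Siebel update events. This is a pure-Python illustration; the timestamps and event names are made up.

```python
from collections import Counter

def tumbling_window_counts(events, window_seconds=60):
    """Count events per key within fixed (tumbling) time windows.
    events: iterable of (timestamp_seconds, key) pairs."""
    counts = Counter()
    for ts, key in events:
        window_start = ts - ts % window_seconds  # bucket start time
        counts[(window_start, key)] += 1
    return counts

events = [(5, "AccountUpdated"), (42, "AccountUpdated"), (75, "AssetCreated")]
print(tumbling_window_counts(events))
# Counter({(0, 'AccountUpdated'): 2, (60, 'AssetCreated'): 1})
```

A rule such as "alert if an account is updated more than N times per minute" (an anomaly-detection use-case from the slide) is then a simple threshold check over these windowed counts.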

Adoption of Kafka in the aming-benefits-increase-with-greater-maturity/

Your next steps
1. Find out if your organization is already running Kafka
2. Watch recordings from the Siebel CRM Virtual Summit
3. Deploy and run Kafka in a Kubernetes cluster
4. Determine your use-cases for Kafka
5. Adopt the Event Driven Framework and Cloud Native Architecture
