
Commit 23126ea

Merge pull request #191 from lmkerbey-mdb/DOCSP-43999
(DOCSP-43999) Promo admonition.
2 parents b1720d2 + 9a0756c

9 files changed: +109 -13 lines

snooty.toml

+5
@@ -16,6 +16,7 @@ toc_landing_pages = [
 ]
 
 [constants]
+atlas-sp = "Atlas Stream Processing"
 connector = "MongoDB Kafka Connector"
 connector-short = "Kafka Connector"
 connector-long = "MongoDB Connector for Apache Kafka"
@@ -25,6 +26,8 @@ kafka-connect-long = "Confluent Kafka Connect"
 avro-long = "Apache Avro"
 avro = "Avro"
 avro-converter = "Kafka Connect Avro Converter (Avro Converter)"
+aws = ":abbr:`AWS (Amazon Web Services)`"
+azure = "Microsoft Azure"
 protobuf-converter = "Kafka Connect Protobuf Converter"
 json-schema-converter = "Kafka Connect JSON Schema Converter"
 connector_version = "1.15"
@@ -34,6 +37,7 @@ connector_version_github_tag = "master"
 connector_kafka_version_major = "2"
 connector_kafka_version_minor = "6"
 connector_kafka_version_docs = "https://kafka.apache.org/{+connector_kafka_version_major+}{+connector_kafka_version_minor+}"
+service = "Atlas"
 sink-connector = "MongoDB Kafka sink connector"
 source-connector = "MongoDB Kafka source connector"
 sink-connector-title = "MongoDB Kafka Sink Connector"
@@ -59,3 +63,4 @@ jmx-port-mapping = "35000"
 sandbox-directory = "kafka-edu/docs-examples/mongodb-kafka-base/"
 win-sandbox-directory = "kafka-edu\\docs-examples\\mongodb-kafka-base\\"
 cluster = "MongoDB cluster"
+clusters = "MongoDB clusters"
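
For context, docs pages in this repo reference these source constants with the ``{+name+}`` substitution syntax; the include file added below uses several of them. As a purely illustrative, hypothetical sentence (not part of this commit) built from the new constants:

   {+atlas-sp+} integrates with {+service+} {+clusters+}, supports VPC peering
   with {+aws+}, and offers Private Link with {+azure+} Event Hub.

At build time, ``{+aws+}`` expands to the ``:abbr:`AWS (Amazon Web Services)``` role, so readers see "AWS" with the full name available on hover, while ``{+clusters+}`` simply becomes "MongoDB clusters".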

source/includes/atlas-sp.rst

+9
@@ -0,0 +1,9 @@
+.. note::
+
+   {+atlas-sp+} provides MongoDB-native tooling to
+   continuously process streaming data, validate schemas, and
+   materialize views into either {+service+} database collections or Apache
+   Kafka topics.
+
+   To learn more about {+atlas-sp+}, see {+service+}
+   `Stream Processing <https://www.mongodb.com/docs/atlas/atlas-sp/overview/>`__.

source/index.txt

+2 -13
@@ -18,6 +18,7 @@ MongoDB Kafka Connector
    Security and Authentication </security-and-authentication>
    Monitoring </monitoring>
    Migrate from the Community Connector <migrate-from-kafka-connect-mongodb>
+   Compare Kafka Connector and Atlas Stream Processing <kafka-connector-atlas-stream-processing-comparison>
    Troubleshooting </troubleshooting>
    How to Contribute </contribute>
    Issues & Help </issues-and-help>
@@ -47,19 +48,7 @@ offerings to host your {+kafka+} cluster and {+connector+}:
 - To learn more about the MongoDB Source Connector, read the `documentation <https://docs.redpanda.com/current/deploy/deployment-option/cloud/managed-connectors/create-mongodb-source-connector/>`__.
 - To learn more about the MongoDB Sink Connector, read the `documentation <https://docs.redpanda.com/current/deploy/deployment-option/cloud/managed-connectors/create-mongodb-sink-connector/>`__.
 
-.. note::
-
-   You can also use Atlas Stream Processing, which is a MongoDB-native way to
-   process streaming data by using the MongoDB Query API. It transforms the way
-   that developers build modern applications.
-
-   Use Atlas Stream Processing to continuously process streaming data,
-   validate schemas, and materialize views into either Atlas database
-   collections or Apache Kafka topics.
-
-   To learn more about Atlas Stream Processing, see the
-   `Atlas Stream Processing <https://www.mongodb.com/products/platform/atlas-stream-processing>`__
-   product page or read the `docs <https://www.mongodb.com/docs/atlas/atlas-sp/overview/>`__.
+.. include:: /includes/atlas-sp.rst
 
 What's New
 ----------

source/introduction.txt

+2
@@ -20,3 +20,5 @@ Read the following sections to learn about the {+connector+}, {+kafka-connect+},
 - :doc:`Connect to MongoDB </introduction/connect>`
 - :doc:`Data Formats </introduction/data-formats>`
 - :doc:`Converters </introduction/converters>`
+
+.. include:: /includes/atlas-sp.rst
source/kafka-connector-atlas-stream-processing-comparison.txt

+83
@@ -0,0 +1,83 @@
+.. _kafka-connector-atlas-stream-processing-comparison:
+
+===================================================
+Compare {+connector-short+} and {+atlas-sp+}
+===================================================
+
+.. toctree::
+   :titlesonly:
+   :maxdepth: 2
+
+.. contents:: On this page
+   :local:
+   :backlinks: none
+   :depth: 2
+   :class: singlecol
+
+This section provides a comparison of the feature sets of the MongoDB
+{+connector-short+} and {+atlas-sp+} to help you identify which tool
+best suits your use case.
+
+.. list-table::
+   :header-rows: 1
+   :widths: 20 40 40
+
+   * - Feature
+     - {+connector-short+}
+     - {+atlas-sp+}
+
+   * - Supported Stream Processing Capabilities
+     - To process streaming data handled by {+connector-short+}, you must
+       do one of the following:
+       - Extend the {+connector-short+} with SMTs or custom Java
+       - Use or write external tooling
+     - Available through the MongoDB aggregation framework, with
+       extensions specific to stream processing.
+
+   * - Installation
+     - Installation required either locally or on Confluent.
+     - No installation required.
+
+   * - Connectivity Tooling
+     - MongoDB Java Driver required.
+     - Connection managed by {+service+}.
+
+   * - Hosting
+     - Hosting required for your {+kafka+} cluster and the Kafka
+       Connector. Use partner services, such as Confluent Cloud, Amazon
+       Managed Streaming, or Redpanda Cloud when possible.
+     - Stream processing functionality fully managed by {+service+}.
+       Hosting required for your {+kafka+} cluster.
+
+   * - Windowing
+     - No support for windowing. You must manually configure windowing
+       with the Kafka Streams API or other external tooling.
+     - Support for highly configurable
+       :atlas:`windows <atlas-stream-processing/windows/>`.
+
+   * - Connection Type Support
+     - - Kafka clusters
+       - {+service+} databases
+       - {+service+} collections
+     - - Kafka clusters
+       - {+service+} {+clusters+}
+       - {+service+} databases
+       - {+service+} collections
+       - HTTPS
+
+   * - Security Features
+     - - SSL/TLS
+       - X.509
+       - {+aws+} IAM
+       - User must develop all other authentication tools
+     - - SSL/TLS
+       - X.509
+       - VPC Peering with {+aws+}
+       - Private Link with {+aws+} Confluent
+       - Private Link with {+aws+} MSK
+       - Private Link with {+azure+} Event Hub
+
+   * - Pricing
+     - Pricing dependent on your hosting provider.
+     - Hourly pricing managed by {+atlas-sp+}. Typical costs are
+       approximately 25% of the cost of the {+connector-short+}.
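
To make the first comparison row concrete: an {+atlas-sp+} stream processor is expressed as an aggregation pipeline with stream-specific stages such as ``$source``, windowing stages, and ``$merge``. The following is only a rough, illustrative sketch; the connection names ``kafkaProd`` and ``atlasCluster`` and the ``sales.totals`` namespace are placeholders, and the exact stage options should be checked against the {+atlas-sp+} documentation linked in the include file above.

   .. code-block:: json

      [
        { "$source": { "connectionName": "kafkaProd", "topic": "sales" } },
        { "$tumblingWindow": {
            "interval": { "size": 1, "unit": "minute" },
            "pipeline": [
              { "$group": { "_id": "$storeId", "total": { "$sum": "$amount" } } }
            ]
        } },
        { "$merge": { "into": { "connectionName": "atlasCluster", "db": "sales", "coll": "totals" } } }
      ]

Nothing comparable ships with the {+connector-short+} itself; as the table notes, equivalent logic there requires SMTs, custom Java, or external tooling such as Kafka Streams.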

source/migrate-from-kafka-connect-mongodb.txt

+2
@@ -12,6 +12,8 @@ The following sections list the changes you must make to your Kafka
 Connect sink connector configuration settings and custom classes to transition
 to the {+sink-connector+}.
 
+.. include:: /includes/atlas-sp.rst
+
 Update Configuration Settings
 -----------------------------
 
source/quick-start.txt

+2
@@ -17,6 +17,8 @@
 .. meta::
    :keywords: get started, tutorial, code example
 
+.. include:: /includes/atlas-sp.rst
+
 Overview
 --------
 
source/sink-connector.txt

+2
@@ -17,6 +17,8 @@ Sink Connector
    :depth: 2
    :class: singlecol
 
+.. include:: /includes/atlas-sp.rst
+
 Overview
 --------
 
source/source-connector.txt

+2
@@ -18,6 +18,8 @@ Source Connector
    :depth: 2
    :class: singlecol
 
+.. include:: /includes/atlas-sp.rst
+
 Overview
 --------
 