
# Commit 17acfcb

Merge Input and Output plugins into Integration plugin

2 parents: 32b039b + b99d69d

23 files changed: +1968 -800 lines
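For users, the merge is intended to be transparent at the configuration level: the `kafka` input and output keep their existing settings and simply ship from a single codebase. A minimal pipeline sketch, assuming the option names documented for the pre-merge plugins (`bootstrap_servers`, `topics`, `topic_id`) carry over unchanged:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"  # broker(s) to consume from
    topics => ["source_topic"]             # topic(s) to subscribe to
  }
}
output {
  kafka {
    bootstrap_servers => "localhost:9092"  # broker(s) to produce to
    topic_id => "destination_topic"        # topic to write events to
  }
}
```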

## .gitignore (+4 -1)

```diff
@@ -7,4 +7,7 @@ lib/log4j/
 lib/net/
 lib/org/
 vendor/
-build/
+build/
+.idea/
+vendor
+*.jar
```

## .travis.yml (-3)

```diff
@@ -1,4 +1,3 @@
----
 sudo: false
 language: ruby
 cache: bundler
@@ -12,8 +11,6 @@ matrix:
     env: LOGSTASH_BRANCH=6.7
   - rvm: jruby-9.1.13.0
     env: LOGSTASH_BRANCH=6.6
-  - rvm: jruby-1.7.27
-    env: LOGSTASH_BRANCH=5.6
   fast_finish: true
 install: true
 before_script: ci/build.sh
```

## CHANGELOG.md (+6 -170)

```diff
@@ -1,170 +1,6 @@
-## 9.1.0
-  - Updated Kafka client version to 2.3.0
-
-## 9.0.1
-  - Added support for `sasl_jaas_config` setting to allow JAAS config per plugin, rather than per JVM [#313](https://github.com/logstash-plugins/logstash-input-kafka/pull/313)
-
-## 9.0.0
-  - Removed obsolete `ssl` option
-
-## 8.3.1
-  - Added support for kafka property ssl.endpoint.identification.algorithm [#302](https://github.com/logstash-plugins/logstash-input-kafka/pull/302)
-
-## 8.3.0
-  - Changed Kafka client version to 2.1.0
-
-## 8.2.1
-  - Changed Kafka client version to 2.0.1 [#295](https://github.com/logstash-plugins/logstash-input-kafka/pull/295)
-
-## 8.2.0
-  - Upgrade Kafka client to version 2.0.0
-
-## 8.1.2
-  - Docs: Correct list formatting for `decorate_events`
-  - Docs: Add kafka default to `partition_assignment_strategy`
-
-## 8.1.1
-  - Fix race-condition where shutting down a Kafka Input before it has finished starting could cause Logstash to crash
-
-## 8.1.0
-  - Internal: Update build to gradle
-  - Upgrade Kafka client to version 1.1.0
-
-## 8.0.6
-  - Fix broken 8.0.5 release
-
-## 8.0.5
-  - Docs: Set the default_codec doc attribute.
-
-## 8.0.4
-  - Upgrade Kafka client to version 1.0.0
-
-## 8.0.3
-  - Update gemspec summary
-
-## 8.0.2
-  - Fix some documentation issues
-
-## 8.0.1
-  - Fixed an issue that prevented setting a custom `metadata_max_age_ms` value
-
-## 8.0.0
-  - Breaking: mark deprecated `ssl` option as obsolete
-
-## 7.0.0
-  - Breaking: Nest the decorated fields under `@metadata` field to avoid mapping conflicts with beats.
-    Fixes #198, #180
-
-## 6.3.4
-  - Fix an issue that led to random failures in decoding messages when using more than one input thread
-
-## 6.3.3
-  - Upgrade Kafka client to version 0.11.0.0
-
-## 6.3.1
-  - fix: Added record timestamp in event decoration
-
-## 6.3.0
-  - Upgrade Kafka client to version 0.10.2.1
-
-## 6.2.7
-  - Fix NPE when SASL_SSL+PLAIN (no Kerberos) is specified.
-
-## 6.2.6
-  - fix: Client ID is no longer reused across multiple Kafka consumer instances
-
-## 6.2.5
-  - Fix a bug where consumer was not correctly setup when `SASL_SSL` option was specified.
-
-## 6.2.4
-  - Make error reporting more clear when connection fails
-
-## 6.2.3
-  - Docs: Update Kafka compatibility matrix
-
-## 6.2.2
-  - update kafka-clients dependency to 0.10.1.1
-
-## 6.2.1
-  - Docs: Clarify compatibility matrix and remove it from the changelog to avoid duplication.
-
-## 6.2.0
-  - Expose config `max_poll_interval_ms` to allow consumer to send heartbeats from a background thread
-  - Expose config `fetch_max_bytes` to control client's fetch response size limit
-
-## 6.1.0
-  - Add Kerberos authentication support.
-
-## 6.0.1
-  - default `poll_timeout_ms` to 100ms
-
-## 6.0.0
-  - Breaking: Support for Kafka 0.10.1.0. Only supports brokers 0.10.1.x or later.
-
-## 5.0.5
-  - place setup_log4j for logging registration behind version check
-
-## 5.0.4
-  - Update to Kafka version 0.10.0.1 for bug fixes
-
-## 5.0.3
-  - Internal: gem cleanup
-
-## 5.0.2
-  - Release a new version of the gem that includes jars
-
-## 5.0.1
-  - Relax constraint on logstash-core-plugin-api to >= 1.60 <= 2.99
-
-## 5.0.0
-  - Support for Kafka 0.10 which is not backward compatible with 0.9 broker.
-
-## 4.0.0
-  - Republish all the gems under jruby.
-  - Update the plugin to the version 2.0 of the plugin api, this change is required for Logstash 5.0 compatibility. See https://github.com/elastic/logstash/issues/5141
-  - Support for Kafka 0.9 for LS 5.x
-
-## 3.0.0.beta7
-  - Fix Log4j warnings by setting up the logger
-
-## 3.0.0.beta5 and 3.0.0.beta6
-  - Internal: Use jar dependency
-  - Fixed issue with snappy compression
-
-## 3.0.0.beta3 and 3.0.0.beta4
-  - Internal: Update gemspec dependency
-
-## 3.0.0.beta2
-  - internal: Use jar dependencies library instead of manually downloading jars
-  - Fixes "java.lang.ClassNotFoundException: org.xerial.snappy.SnappyOutputStream" issue (#50)
-
-## 3.0.0.beta2
-  - Added SSL/TLS connection support to Kafka
-  - Breaking: Changed default codec to plain instead of json. Json codec is really slow when used
-    with inputs because inputs by default are single threaded. This makes it a bad
-    first user experience. Plain codec is a much better default.
-
-## 3.0.0.beta1
-  - Refactor to use new Java based consumer, bypassing jruby-kafka
-  - Breaking: Change configuration to match Kafka's configuration. This version is not backward compatible
-
-## 2.0.7
-  - Update to jruby-kafka 1.6 which includes Kafka 0.8.2.2 enabling LZ4 decompression.
-
-## 2.0.6
-  - Depend on logstash-core-plugin-api instead of logstash-core, removing the need to mass update plugins on major releases of logstash
-
-## 2.0.5
-  - New dependency requirements for logstash-core for the 5.0 release
-
-## 2.0.4
-  - Fix safe shutdown while plugin waits on Kafka for new events
-  - Expose auto_commit_interval_ms to control offset commit frequency
-
-## 2.0.3
-  - Fix infinite loop when no new messages are found in Kafka
-
-## 2.0.0
-  - Plugins were updated to follow the new shutdown semantic, this mainly allows Logstash to instruct input plugins to terminate gracefully,
-    instead of using Thread.raise on the plugins' threads. Ref: https://github.com/elastic/logstash/pull/3895
-  - Dependency on logstash-core update to 2.0
+## 10.0.0
+  - Initial release of the Kafka Integration Plugin, which combines
+    previously-separate Kafka plugins and shared dependencies into a single
+    codebase; independent changelogs for previous versions can be found:
+    - [Kafka Input Plugin @9.1.0](https://github.com/logstash-plugins/logstash-input-kafka/blob/v9.1.0/CHANGELOG.md)
+    - [Kafka Output Plugin @8.1.0](https://github.com/logstash-plugins/logstash-output-kafka/blob/v8.1.0/CHANGELOG.md)
```
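Several of the retired entries above describe settings that remain operative after the merge. For example, the 9.0.1 entry's per-plugin `sasl_jaas_config` takes a JAAS stanza as a string; a sketch, assuming the setting keeps that shape in the integration plugin (credentials and companion options `security_protocol` / `sasl_mechanism` shown for illustration):

```
input {
  kafka {
    ...
    security_protocol => "SASL_PLAINTEXT"
    sasl_mechanism => "PLAIN"
    # JAAS config scoped to this plugin instance rather than the whole JVM
    sasl_jaas_config => "org.apache.kafka.common.security.plain.PlainLoginModule required username='user' password='secret';"
  }
}
```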

## CONTRIBUTORS (+3)

```diff
@@ -8,6 +8,9 @@ Contributors:
 * Richard Pijnenburg (electrical)
 * Suyog Rao (suyograo)
 * Tal Levy (talevy)
+* João Duarte (jsvd)
+* Kurt Hurtado (kurtado)
+* Ry Biesemeyer (yaauie)
 
 Note: If you've sent us patches, bug reports, or otherwise contributed to
 Logstash, and you aren't on the list above and want to be, please let us know
```

## DEVELOPER.md (+62 -8)

```diff
@@ -1,14 +1,24 @@
-logstash-input-kafka
-====================
+# logstash-integration-kafka
+
+Apache Kafka integration for Logstash, including Input and Output plugins.
+
+# Dependencies
+
+* Apache Kafka version 0.8.1.1
+* jruby-kafka library
+
+# Plugins
+
+
+## logstash-input-kafka
 
 Apache Kafka input for Logstash. This input will consume messages from a Kafka topic using the high level consumer API exposed by Kafka.
 
 For more information about Kafka, refer to this [documentation](http://kafka.apache.org/documentation.html)
 
 Information about high level consumer API can be found [here](http://kafka.apache.org/documentation.html#highlevelconsumerapi)
 
-Logstash Configuration
-====================
+### Logstash Configuration
 
 See http://kafka.apache.org/documentation.html#consumerconfigs for details about the Kafka consumer options.
 
@@ -36,8 +46,52 @@ See http://kafka.apache.org/documentation.html#consumerconfigs for details about
 
 The default codec is json
 
-Dependencies
-====================
+## logstash-output-kafka
 
-* Apache Kafka version 0.8.1.1
-* jruby-kafka library
+Apache Kafka output for Logstash. This output will produce messages to a Kafka topic using the producer API exposed by Kafka.
+
+For more information about Kafka, refer to this [documentation](http://kafka.apache.org/documentation.html)
+
+Information about producer API can be found [here](http://kafka.apache.org/documentation.html#apidesign)
+
+### Logstash Configuration
+
+See http://kafka.apache.org/documentation.html#producerconfigs for details about the Kafka producer options.
+
+    output {
+      kafka {
+        topic_id => ... # string (required), The topic to produce the messages to
+        broker_list => ... # string (optional), default: "localhost:9092", This is for bootstrapping and the producer will only use it for getting metadata
+        compression_codec => ... # string (optional), one of ["none", "gzip", "snappy"], default: "none"
+        compressed_topics => ... # string (optional), default: "", This parameter allows you to set whether compression should be turned on for particular topics
+        request_required_acks => ... # number (optional), one of [-1, 0, 1], default: 0, This value controls when a produce request is considered completed
+        serializer_class => ... # string, (optional) default: "kafka.serializer.StringEncoder", The serializer class for messages. The default encoder takes a byte[] and returns the same byte[]
+        partitioner_class => ... # string (optional) default: "kafka.producer.DefaultPartitioner"
+        request_timeout_ms => ... # number (optional) default: 10000
+        producer_type => ... # string (optional), one of ["sync", "async"] default => 'sync'
+        key_serializer_class => ... # string (optional) default: kafka.serializer.StringEncoder
+        message_send_max_retries => ... # number (optional) default: 3
+        retry_backoff_ms => ... # number (optional) default: 100
+        topic_metadata_refresh_interval_ms => ... # number (optional) default: 600 * 1000
+        queue_buffering_max_ms => ... # number (optional) default: 5000
+        queue_buffering_max_messages => ... # number (optional) default: 10000
+        queue_enqueue_timeout_ms => ... # number (optional) default: -1
+        batch_num_messages => ... # number (optional) default: 200
+        send_buffer_bytes => ... # number (optional) default: 100 * 1024
+        client_id => ... # string (optional) default: ""
+        partition_key_format => ... # string (optional) default: nil, Provides a way to specify a partition key as a string
+      }
+    }
+
+The default codec is json for outputs. If you select a codec of plain, logstash will encode your messages with not only the message
+but also with a timestamp and hostname. If you do not want anything but your message passing through, you should make
+the output configuration something like:
+
+    output {
+      kafka {
+        codec => plain {
+          format => "%{message}"
+        }
+        topic_id => "my_topic_id"
+      }
+    }
```
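The input plugin's own configuration block sits in the unchanged middle of the file (the `@@ -36,8 +46,52 @@` hunk skips over it), so it does not appear in this diff. For context, a sketch in the same legacy Kafka 0.8-era style the file documents, assuming the option names of that era (`zk_connect`, `group_id`, `topic_id`):

```
input {
  kafka {
    zk_connect => "localhost:2181"  # Zookeeper quorum used by the 0.8 high-level consumer
    group_id => "logstash"          # consumer group this instance joins
    topic_id => "my_topic"          # topic to consume messages from
  }
}
```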

## NOTICE.TXT (+1 -1)

```diff
@@ -1,5 +1,5 @@
 Elasticsearch
-Copyright 2012-2015 Elasticsearch
+Copyright 2012-2019 Elastic NV
 
 This product includes software developed by The Apache Software
 Foundation (http://www.apache.org/).
```

## README.md (+18 -11)

````diff
@@ -1,6 +1,6 @@
 # Logstash Plugin
 
-[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-input-kafka.svg)](https://travis-ci.org/logstash-plugins/logstash-input-kafka)
+[![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-integration-kafka.svg)](https://travis-ci.org/logstash-plugins/logstash-integration-kafka)
 
 This is a plugin for [Logstash](https://github.com/elastic/logstash).
 
@@ -38,6 +38,7 @@ Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/log
 - Create a new plugin or clone an existing one from the GitHub [logstash-plugins](https://github.com/logstash-plugins) organization. We also provide [example plugins](https://github.com/logstash-plugins?query=example).
 
 - Install dependencies
+
 ```sh
 bundle install
 rake install_jars
@@ -52,19 +53,30 @@ bundle install
 rake install_jars
 ```
 
-- Run tests
+- Run unit tests
 
 ```sh
 bundle exec rspec
 ```
 
+- Run integration tests
+
+You'll need to have docker available within your test environment before
+running the integration tests. The tests depend on a specific Kafka image
+found in Docker Hub called `spotify/kafka`. You will need internet connectivity
+to pull in this image if it does not already exist locally.
+
+```sh
+bundle exec rspec --tag integration
+```
+
 ### 2. Running your unpublished Plugin in Logstash
 
 #### 2.1 Run in a local Logstash clone
 
 - Edit Logstash `Gemfile` and add the local plugin path, for example:
 ```ruby
-gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
+gem "logstash-output-kafka", :path => "/your/local/logstash-output-kafka"
 ```
 - Install plugin
 ```sh
@@ -77,7 +89,7 @@ bin/plugin install --no-verify
 ```
 - Run Logstash with your plugin
 ```sh
-bin/logstash -e 'filter {awesome {}}'
+bin/logstash -e 'output { kafka { topic_id => "kafka_topic" }}'
 ```
 At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
 
@@ -87,16 +99,11 @@ You can use the same **2.1** method to run your plugin in an installed Logstash
 
 - Build your plugin gem
 ```sh
-gem build logstash-filter-awesome.gemspec
+gem build logstash-output-kafka.gemspec
 ```
 - Install the plugin from the Logstash home
 ```sh
-# Logstash 2.3 and higher
-bin/logstash-plugin install --no-verify
-
-# Prior to Logstash 2.3
-bin/plugin install --no-verify
-
+bin/plugin install /your/local/plugin/logstash-output-kafka.gem
 ```
 - Start Logstash and proceed to test the plugin
````
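The integration-test note added above expects the `spotify/kafka` image to be present; Docker pulls it on first use, but it can also be fetched up front. A sketch, assuming Docker is installed and on `PATH`:

```sh
# pre-fetch the Kafka broker image the integration specs rely on
docker pull spotify/kafka

# then run only the integration-tagged specs
bundle exec rspec --tag integration
```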

## Rakefile (-2)

```diff
@@ -1,4 +1,3 @@
-
 # encoding: utf-8
 require "logstash/devutils/rake"
 require "jars/installer"
@@ -17,4 +16,3 @@ task :clean do
     FileUtils.rm_rf(p)
   end
 end
-
```
