MINIFICPP-2294 Flow migration #1850

Draft · wants to merge 4 commits into base: main
7 changes: 0 additions & 7 deletions PROCESSORS.md
@@ -1251,10 +1251,8 @@ In the list below, the names of required properties appear in bold. Any other pr
| invokehttp-proxy-username | | | Username to set when authenticating against proxy |
| invokehttp-proxy-password | | | Password to set when authenticating against proxy<br/>**Sensitive Property: true** |
| Content-type | application/octet-stream | | The Content-Type to specify for when content is being transmitted through a PUT, POST or PATCH. In the case of an empty value after evaluating an expression language expression, Content-Type defaults to |
| send-message-body | true | true<br/>false | DEPRECATED. Only kept for backwards compatibility, no functionality is included. |
| Send Message Body | true | true<br/>false | If true, sends the HTTP message body on POST/PUT/PATCH requests (default). If false, suppresses the message body and content-type header for these requests. |
| Use Chunked Encoding | false | true<br/>false | When POST'ing, PUT'ing or PATCH'ing content set this property to true in order to not pass the 'Content-length' header and instead send 'Transfer-Encoding' with a value of 'chunked'. This will enable the data transfer mechanism which was introduced in HTTP 1.1 to pass data of unknown lengths in chunks. |
| Disable Peer Verification | false | true<br/>false | DEPRECATED. The value is ignored, peer and host verification are always performed when using SSL/TLS. |
| Put Response Body in Attribute | | | If set, the response body received back will be put into an attribute of the original FlowFile instead of a separate FlowFile. The attribute key to put to is determined by evaluating value of this property. |
| Always Output Response | false | true<br/>false | Will force a response FlowFile to be generated and routed to the 'Response' relationship regardless of what the server status code received is |
| Penalize on "No Retry" | false | true<br/>false | Enabling this property will penalize FlowFiles that are routed to the "No Retry" relationship. |
@@ -2061,12 +2059,7 @@ In the list below, the names of required properties appear in bold. Any other pr
| Queue Max Message | 1000 | | Maximum number of messages allowed on the producer queue |
| Compress Codec | none | none<br/>gzip<br/>snappy<br/>lz4<br/>zstd | compression codec to use for compressing message sets |
| Max Flow Segment Size | 0 B | | Maximum flow content payload segment size for the kafka record. 0 B means unlimited. |
| Security CA | | | DEPRECATED in favor of SSL Context Service. File or directory path to CA certificate(s) for verifying the broker's key |
| Security Cert | | | DEPRECATED in favor of SSL Context Service.Path to client's public key (PEM) used for authentication |
| Security Private Key | | | DEPRECATED in favor of SSL Context Service.Path to client's private key (PEM) used for authentication |
| Security Pass Phrase | | | DEPRECATED in favor of SSL Context Service.Private key passphrase<br/>**Sensitive Property: true** |
| Kafka Key | | | The key to use for the message. If not specified, the UUID of the flow file is used as the message key.<br/>**Supports Expression Language: true** |
| Message Key Field | | | DEPRECATED, does not work -- use Kafka Key instead |
| Debug contexts | | | A comma-separated list of debug contexts to enable.Including: generic, broker, topic, metadata, feature, queue, msg, protocol, cgrp, security, fetch, interceptor, plugin, consumer, admin, eos, all |
| Fail empty flow files | true | true<br/>false | Keep backwards compatibility with <=0.7.0 bug which caused flow files with empty content to not be published to Kafka and forwarded to failure. The old behavior is deprecated. Use connections to drop empty flow files! |

1 change: 0 additions & 1 deletion docker/test/integration/features/kafka.feature
@@ -61,7 +61,6 @@ Feature: Sending data to using Kafka streaming platform using PublishKafka
And the Minifi logs contain the following message: "PublishKafka: client.id [client_no_42]" in less than 10 seconds
And the Minifi logs contain the following message: "PublishKafka: Message Key [unique_message_key_123]" in less than 10 seconds
And the Minifi logs contain the following message: "PublishKafka: DynamicProperty: [retry.backoff.ms] -> [150]" in less than 10 seconds
And the Minifi logs contain the following message: "The Message Key Field property is set. This property is DEPRECATED and has no effect; please use Kafka Key instead." in less than 10 seconds

Examples: Compression formats
| compress_codec |
2 changes: 1 addition & 1 deletion extensions/kafka/CMakeLists.txt
@@ -25,7 +25,7 @@ include(Fetchlibrdkafka)

include(${CMAKE_SOURCE_DIR}/extensions/ExtensionHeader.txt)

file(GLOB SOURCES "*.cpp")
file(GLOB SOURCES "*.cpp" "migrators/*.cpp")

add_minifi_library(minifi-rdkafka-extensions SHARED ${SOURCES})

5 changes: 1 addition & 4 deletions extensions/kafka/KafkaProcessorBase.cpp
@@ -22,16 +22,13 @@

namespace org::apache::nifi::minifi::processors {

std::optional<utils::net::SslData> KafkaProcessorBase::getSslData(core::ProcessContext& context) const {
return utils::net::getSslData(context, SSLContextService, logger_);
}

void KafkaProcessorBase::setKafkaAuthenticationParameters(core::ProcessContext& context, gsl::not_null<rd_kafka_conf_t*> config) {
security_protocol_ = utils::parseEnumProperty<kafka::SecurityProtocolOption>(context, SecurityProtocol);
utils::setKafkaConfigurationField(*config, "security.protocol", std::string{magic_enum::enum_name(security_protocol_)});
logger_->log_debug("Kafka security.protocol [{}]", magic_enum::enum_name(security_protocol_));
if (security_protocol_ == kafka::SecurityProtocolOption::ssl || security_protocol_ == kafka::SecurityProtocolOption::sasl_ssl) {
if (auto ssl_data = getSslData(context)) {
if (auto ssl_data = utils::net::getSslData(context, SSLContextService, logger_)) {
if (ssl_data->ca_loc.empty() && ssl_data->cert_loc.empty() && ssl_data->key_loc.empty() && ssl_data->key_pw.empty()) {
logger_->log_warn("Security protocol is set to {}, but no valid security parameters are set in the properties or in the SSL Context Service.",
magic_enum::enum_name(security_protocol_));
1 change: 0 additions & 1 deletion extensions/kafka/KafkaProcessorBase.h
@@ -100,7 +100,6 @@ class KafkaProcessorBase : public core::Processor {
~KafkaProcessorBase() override = default;

protected:
virtual std::optional<utils::net::SslData> getSslData(core::ProcessContext& context) const;
void setKafkaAuthenticationParameters(core::ProcessContext& context, gsl::not_null<rd_kafka_conf_t*> config);

kafka::SecurityProtocolOption security_protocol_{};
21 changes: 0 additions & 21 deletions extensions/kafka/PublishKafka.cpp
@@ -371,14 +371,6 @@ void PublishKafka::onSchedule(core::ProcessContext& context, core::ProcessSessio
conn_ = std::make_unique<KafkaConnection>(key_);
configureNewConnection(context);

if (const auto message_key_field = context.getProperty(MessageKeyField);
message_key_field && !message_key_field->empty()) {
logger_->log_error(
"The {} property is set. This property is DEPRECATED and has no "
"effect; please use Kafka Key instead.",
MessageKeyField.name);
}

logger_->log_debug("Successfully configured PublishKafka");
}

@@ -658,19 +650,6 @@ bool PublishKafka::createNewTopic(core::ProcessContext& context, const std::stri
return true;
}

std::optional<utils::net::SslData> PublishKafka::getSslData(core::ProcessContext& context) const {
if (auto result = KafkaProcessorBase::getSslData(context); result) { return result; }

utils::net::SslData ssl_data;
if (auto security_ca = context.getProperty(SecurityCA)) { ssl_data.ca_loc = *security_ca; }
if (auto security_cert = context.getProperty(SecurityCert)) { ssl_data.cert_loc = *security_cert; }
if (auto security_private_key = context.getProperty(SecurityPrivateKey)) { ssl_data.key_loc = *security_private_key; }
if (auto security_private_key_pass = context.getProperty(SecurityPrivateKeyPassWord)) {
ssl_data.key_pw = *security_private_key_pass;
}
return ssl_data;
}

void PublishKafka::onTrigger(core::ProcessContext& context, core::ProcessSession& session) {
// Check whether we have been interrupted
if (interrupted_) {
34 changes: 2 additions & 32 deletions extensions/kafka/PublishKafka.h
@@ -169,42 +169,13 @@ class PublishKafka final : public KafkaProcessorBase {
.withPropertyType(core::StandardPropertyTypes::DATA_SIZE_TYPE)
.withDefaultValue("0 B")
.build();
EXTENSIONAPI static constexpr auto SecurityCA =
core::PropertyDefinitionBuilder<>::createProperty("Security CA")
.withDescription(
"DEPRECATED in favor of SSL Context Service. File or directory "
"path to CA certificate(s) for verifying the broker's key")
.build();
EXTENSIONAPI static constexpr auto SecurityCert =
core::PropertyDefinitionBuilder<>::createProperty("Security Cert")
.withDescription(
"DEPRECATED in favor of SSL Context Service.Path to client's "
"public key (PEM) used for authentication")
.build();
EXTENSIONAPI static constexpr auto SecurityPrivateKey =
core::PropertyDefinitionBuilder<>::createProperty("Security Private Key")
.withDescription(
"DEPRECATED in favor of SSL Context Service.Path to client's "
"private key (PEM) used for authentication")
.build();
EXTENSIONAPI static constexpr auto SecurityPrivateKeyPassWord =
core::PropertyDefinitionBuilder<>::createProperty("Security Pass Phrase")
.withDescription(
"DEPRECATED in favor of SSL Context Service.Private key "
"passphrase")
.isSensitive(true)
.build();
EXTENSIONAPI static constexpr auto KafkaKey =
core::PropertyDefinitionBuilder<>::createProperty("Kafka Key")
.withDescription(
"The key to use for the message. If not specified, the UUID of "
"the flow file is used as the message key.")
.supportsExpressionLanguage(true)
.build();
EXTENSIONAPI static constexpr auto MessageKeyField =
core::PropertyDefinitionBuilder<>::createProperty("Message Key Field")
.withDescription("DEPRECATED, does not work -- use Kafka Key instead")
.build();
EXTENSIONAPI static constexpr auto DebugContexts =
core::PropertyDefinitionBuilder<>::createProperty("Debug contexts")
.withDescription(
@@ -229,8 +200,8 @@
EXTENSIONAPI static constexpr auto Properties = utils::array_cat(KafkaProcessorBase::Properties,
std::to_array<core::PropertyReference>({SeedBrokers, Topic, DeliveryGuarantee, MaxMessageSize, RequestTimeOut,
MessageTimeOut, ClientName, BatchSize, TargetBatchPayloadSize, AttributeNameRegex, QueueBufferMaxTime,
QueueBufferMaxSize, QueueBufferMaxMessage, CompressCodec, MaxFlowSegSize, SecurityCA, SecurityCert,
SecurityPrivateKey, SecurityPrivateKeyPassWord, KafkaKey, MessageKeyField, DebugContexts, FailEmptyFlowFiles}));
QueueBufferMaxSize, QueueBufferMaxMessage, CompressCodec, MaxFlowSegSize, KafkaKey, DebugContexts,
FailEmptyFlowFiles}));

EXTENSIONAPI static constexpr auto Success = core::RelationshipDefinition{"success",
"Any FlowFile that is successfully sent to Kafka will be routed to "
@@ -267,7 +238,6 @@ class PublishKafka final : public KafkaProcessorBase {
bool configureNewConnection(core::ProcessContext& context);
bool createNewTopic(
core::ProcessContext& context, const std::string& topic_name, const std::shared_ptr<core::FlowFile>& flow_file) const;
std::optional<utils::net::SslData> getSslData(core::ProcessContext& context) const override;

private:
KafkaConnectionKey key_;