hop.io

class hop.io.Consumer(group_id, broker_addresses, topics, **kwargs)[source]

An event stream opened for reading one or more topics. Instances of this class should be obtained from Stream.open().

close()[source]

End all subscriptions and shut down.

mark_done(metadata)[source]

Mark a message as fully processed.

Parameters

metadata – A Metadata instance containing broker-specific metadata.
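
A minimal sketch of manual commits (a hedged example: the broker URL and topic are placeholders, and it assumes that read(metadata=True, autocommit=False), documented below, yields (message, metadata) pairs):

    from hop.io import Stream

    with Stream().open("kafka://hostname:port/topic", "r") as consumer:
        for message, metadata in consumer.read(metadata=True, autocommit=False):
            print(message)                # stand-in for real processing
            consumer.mark_done(metadata)  # commit only after processing succeeds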

read(metadata=False, autocommit=True, **kwargs)[source]

Read messages from a stream.

Parameters
  • metadata – Whether to receive message metadata alongside messages.

  • autocommit – Whether messages are automatically marked as handled via mark_done when the next message is yielded. Defaults to True.

  • batch_size – The number of messages to request from Kafka at a time. Smaller values can reduce latency, while larger values are more efficient but may add latency.

  • batch_timeout – The period of time to wait for a full batch of messages from Kafka. As with batch_size, smaller values can reduce latency while larger values are more efficient at the cost of greater latency. If specified, this argument should be a datetime.timedelta object.
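
As an illustration, a hedged sketch of trading latency for efficiency with these parameters (the broker URL and topic are placeholders):

    from datetime import timedelta

    from hop.io import Stream

    with Stream().open("kafka://hostname:port/topic", "r") as consumer:
        # Small batches and a short timeout favor latency over throughput.
        for message in consumer.read(batch_size=10,
                                     batch_timeout=timedelta(seconds=0.5)):
            print(message)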

class hop.io.Deserializer(value)

An enumeration of the supported message formats used to deserialize messages read from a stream.

class hop.io.Metadata(topic: str, partition: int, offset: int, timestamp: int, key: Union[str, bytes], headers: List[Tuple[str, bytes]], _raw: cimpl.Message)[source]

Broker-specific metadata that accompanies a consumed message.

class hop.io.Producer(broker_addresses, topic, **kwargs)[source]

An event stream opened for writing to a topic. Instances of this class should be obtained from Stream.open().

close()[source]

Wait for enqueued messages to be written and shut down.

write(message, headers=None, delivery_callback=<function raise_delivery_errors>)[source]

Write messages to a stream.

Parameters
  • message – The message to write.

  • headers – Any headers to attach to the message, either as a dictionary mapping strings to strings, or as a list of 2-tuples of strings.

  • delivery_callback – A callback which will be called when each message is either delivered or permanently fails to be delivered.
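
A hedged sketch of publishing a single message with headers; the broker URL, topic, and dict payload are placeholders (depending on the hop-client version, the payload may need to be one of the hop.models message types), and the default delivery_callback raises on permanent delivery failure:

    from hop.io import Stream

    with Stream().open("kafka://hostname:port/topic", "w") as producer:
        producer.write(
            {"content": "hello"},                  # placeholder payload
            headers={"sender": "example-client"},  # dict of str -> str
        )
    # Leaving the with-block calls close(), which flushes enqueued messages.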

class hop.io.Stream(auth=True, start_at=ConsumerStartPosition.LATEST, until_eos=False)[source]

Defines an event stream.

Sets up defaults used within the client so that when a stream connection is opened, it will use the defaults specified here.

Parameters
  • auth – A bool or Auth instance. Defaults to loading from auth.load_auth if set to True. To disable authentication, set to False.

  • start_at – The message offset to start at in read mode. Defaults to LATEST.

  • until_eos – Whether to listen for new messages forever (False) or stop once the end of the stream (EOS) is reached in read mode (True). Defaults to False.
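
For example, a sketch of overriding these defaults; it assumes the start-position enum is exposed as hop.io.StartPosition (rendered above under its class name, ConsumerStartPosition) with an EARLIEST member:

    from hop.io import StartPosition, Stream

    # Replay all retained messages, then stop at end of stream instead of waiting.
    stream = Stream(start_at=StartPosition.EARLIEST, until_eos=True)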

open(url, mode='r', group_id=None, **kwargs)[source]

Opens a connection to an event stream.

Parameters
  • url – Sets the broker URL to connect to.

  • mode – Read (‘r’) or write (‘w’) from the stream.

  • group_id – The consumer group ID from which to read. Generated automatically if not specified.

Returns

An open connection to the event stream: a Producer instance in write mode or a Consumer instance in read mode.

Raises

ValueError – If the mode is not set to read/write, if more than one topic is specified in write mode, or if more than one broker is specified.
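
Putting it together, a hedged usage sketch (hostname, port, topic, payload, and group ID are placeholders; using the returned object as a context manager ensures close() is called):

    from hop.io import Stream

    stream = Stream(auth=False)  # e.g. against an unauthenticated test broker

    # Write mode returns a Producer.
    with stream.open("kafka://hostname:port/topic", "w") as producer:
        producer.write({"content": "hello"})  # placeholder payload

    # Read mode returns a Consumer; group_id is generated if not specified.
    with stream.open("kafka://hostname:port/topic", "r", group_id="my-group") as consumer:
        for message in consumer.read():
            print(message)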

hop.io.list_topics(url: str, auth: Union[bool, hop.auth.Auth] = True)[source]

List the accessible topics on the Kafka broker referred to by url.

Parameters
  • url – The Kafka broker URL. Only one broker may be specified. Topics may also be included in the URL, in which case only those of the specified topics that are actually present on the broker will be returned. If a userinfo component is present in the URL and auth is True, it will be treated as a hint for the automatic auth lookup.

  • auth – A bool or Auth instance. Defaults to loading from auth.load_auth if set to True. To disable authentication, set to False. If a username is specified as part of url but auth is an Auth instance, the URL information will be ignored.

Returns

A dictionary mapping topic names to confluent_kafka.admin.TopicMetadata instances.

Raises

ValueError – If more than one broker is specified.
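
For example (a sketch; the broker URL is a placeholder, and credentials are loaded automatically because auth defaults to True):

    from hop.io import list_topics

    # Dict mapping topic names to confluent_kafka.admin.TopicMetadata instances.
    topics = list_topics("kafka://hostname:port")
    print(sorted(topics))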