7.zhengda.lu/incident 37187 #20065

Draft
wants to merge 7 commits into base: 7.65.x
6 changes: 1 addition & 5 deletions .deps/image_digests.json
@@ -1,5 +1 @@
{
"linux-aarch64": "sha256:1a4a3ea10f1c2cafb3dc1fd9eda02826026e0e660f670f9c065b6b0782f4d901",
"linux-x86_64": "sha256:00483449e9a400fc6e5a8433cb063f0946e20e1feda5f70b910d1bd451b63936",
"windows-x86_64": "sha256:9f34e9b4e33bb55f8e25ef12d7ec6e7e9f6916691a0b849bb1ebfb65e153f2d9"
}
{}
2 changes: 1 addition & 1 deletion .deps/metadata.json
@@ -1,3 +1,3 @@
{
"sha256": "31b12fc41197bf7276278e2eff6b288ff53f922e96277aa9b8dfff37f272beea"
"sha256": "98dc19dad7c8e401411254fe89f23f0119229419d09af3ed0b54093a0370837b"
}
36 changes: 18 additions & 18 deletions .deps/resolved/linux-aarch64_3.12.txt

Large diffs are not rendered by default.

48 changes: 24 additions & 24 deletions .deps/resolved/linux-x86_64_3.12.txt

Large diffs are not rendered by default.

40 changes: 20 additions & 20 deletions .deps/resolved/macos-x86_64_3.12.txt

Large diffs are not rendered by default.

40 changes: 20 additions & 20 deletions .deps/resolved/windows-x86_64_3.12.txt

Large diffs are not rendered by default.

@@ -1,4 +1,4 @@
name: Build dependencies
name: Resolve Dependencies and Build Wheels

on:
workflow_dispatch:
@@ -236,8 +236,8 @@ jobs:
path: output

publish:
name: Publish artifacts
if: github.event_name == 'push' || (github.event_name == 'workflow_dispatch' && (github.ref == github.event.repository.default_branch || startsWith(github.ref, '7.')))
name: Publish artifacts and update lockfiles via PR
if: github.event_name == 'push' || (github.event_name == 'workflow_dispatch' && (github.ref_name == github.event.repository.default_branch || startsWith(github.ref_name, '7.')))
needs:
- build
- build-macos
2 changes: 1 addition & 1 deletion LICENSE-3rdparty.csv
@@ -11,7 +11,7 @@ binary,PyPI,Apache-2.0,Copyright 2018 Ofek Lev
binary,PyPI,MIT,Copyright 2018 Ofek Lev
boto3,PyPI,Apache-2.0,"Copyright 2013-2017 Amazon.com, Inc. or its affiliates. All Rights Reserved."
botocore,PyPI,Apache-2.0,"Copyright 2012-2022 Amazon.com, Inc. or its affiliates. All Rights Reserved."
cachetools,PyPI,MIT,Copyright (c) 2014-2025 Thomas Kemmer
cachetools,PyPI,MIT,Copyright (c) 2014-2024 Thomas Kemmer
check-postgres,"https://github.com/bucardo/",BSD-2-Clause,Copyright 2007 - 2023 Greg Sabino Mullane
clickhouse-cityhash,PyPI,MIT,"Copyright (c) 2011, Alexander Marshalov <[email protected]>"
clickhouse-driver,PyPI,MIT,Copyright (c) 2017 by Konstantin Lebedev.
18 changes: 9 additions & 9 deletions agent_requirements.in
@@ -1,23 +1,23 @@
aerospike==7.1.1; sys_platform != 'win32' and sys_platform != 'darwin'
aws-requests-auth==0.4.3
azure-identity==1.20.0
beautifulsoup4==4.13.3
azure-identity==1.19.0
beautifulsoup4==4.12.3
binary==1.0.1
boto3==1.36.26
botocore==1.36.26
cachetools==5.5.2
boto3==1.36.16
botocore==1.36.16
cachetools==5.5.1
clickhouse-cityhash==1.0.2.4
clickhouse-driver==0.2.9
cm-client==45.0.4
confluent-kafka==2.8.0
cryptography==44.0.1
cryptography==43.0.1
ddtrace==2.10.6
dnspython==2.7.0
foundationdb==6.3.24
hazelcast-python-client==5.5.0
in-toto==2.0.0
jellyfish==1.1.3
kubernetes==32.0.1
kubernetes==32.0.0
lazy-loader==0.4
ldap3==2.9.1
lxml==5.1.1
@@ -41,7 +41,7 @@ pymongo[srv]==4.8.0; python_version >= '3.9'
pymqi==1.12.11; sys_platform != 'darwin' or platform_machine != 'arm64'
pymysql==1.1.1
pyodbc==5.2.0; sys_platform != 'darwin' or platform_machine != 'arm64'
pyopenssl==24.3.0
pyopenssl==24.2.1
pysmi==1.2.1
pysnmp-mibs==0.1.6
pysnmp==5.1.0
@@ -63,7 +63,7 @@ rethinkdb==2.4.10.post1
securesystemslib[crypto,pynacl]==0.28.0
semver==3.0.4
service-identity[idna]==24.2.0
simplejson==3.20.1
simplejson==3.19.3
snowflake-connector-python==3.13.2
supervisor==4.2.5
tuf==4.0.0
2 changes: 1 addition & 1 deletion amazon_msk/pyproject.toml
@@ -36,7 +36,7 @@ license = "BSD-3-Clause"

[project.optional-dependencies]
deps = [
"boto3==1.36.26",
"boto3==1.36.16",
]

[project.urls]
2 changes: 1 addition & 1 deletion cisco_aci/pyproject.toml
@@ -36,7 +36,7 @@ license = "BSD-3-Clause"

[project.optional-dependencies]
deps = [
"cryptography==44.0.1",
"cryptography==43.0.1",
]

[project.urls]
10 changes: 7 additions & 3 deletions datadog_checks_base/README.md
@@ -32,11 +32,15 @@ pip install datadog-checks-base
## Performance Optimizations

We strive to balance lean resource usage with a "batteries included" user experience.
We employ a few tricks to achieve this.
This is why we import some of our dependencies inside functions that use them instead of the more conventional import section at the top of the file.

One of them is the [lazy-loader][9] library that allows us to expose a nice API (simple, short imports) without the baseline memory overhead of importing everything all the time.
Below are some examples of how much we shave off the Python heap for a given dependency:

Another trick is to import some of our dependencies inside functions that use them instead of the more conventional import section at the top of the file. We rely on this the most in the `AgentCheck` base class.
- `requests==2.32.3`: 3.6MB
- `RequestWrapper` class (`datadog_checks_base==37.7.0`): 2.9MB
- `prometheus-client==0.21.1`: around 1MB

This translates into even bigger savings when we run in the Agent, something close to 50MB.
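
To make the pattern concrete, here is a minimal, hypothetical sketch of the deferred import the README describes (the check class and attribute names are illustrative, not from this repository):

```python
# Hypothetical check illustrating the deferred-import pattern: `requests`
# is only imported, and its heap cost only paid, on first HTTP use.
class MyCheck:
    @property
    def http(self):
        if not hasattr(self, '_http'):
            import requests  # deferred import, per the optimization above

            self._http = requests.Session()
        return self._http
```

A check that never touches `self.http` never imports `requests` at all.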

## Troubleshooting

83 changes: 29 additions & 54 deletions datadog_checks_base/datadog_checks/base/checks/base.py
@@ -1,14 +1,15 @@
# (C) Datadog, Inc. 2018-present
# All rights reserved
# Licensed under a 3-clause BSD style license (see LICENSE)
from __future__ import annotations

import copy
import functools
import importlib
import inspect
import logging
import os
import re
import traceback
import unicodedata
from collections import deque
from os.path import basename
from typing import ( # noqa: F401
@@ -26,10 +27,10 @@
Union,
)

import lazy_loader
import yaml
from pydantic import BaseModel, ValidationError

from datadog_checks.base.agent import AGENT_RUNNING, aggregator, datadog_agent
from datadog_checks.base.utils.format import json

from ..config import is_affirmative
from ..constants import ServiceCheck
@@ -45,8 +46,14 @@
)
from ..utils.agent.utils import should_profile_memory
from ..utils.common import ensure_bytes, to_native_string
from ..utils.diagnose import Diagnosis
from ..utils.fips import enable_fips
from ..utils.limiter import Limiter
from ..utils.metadata import MetadataManager
from ..utils.secrets import SecretsSanitizer
from ..utils.serialization import from_json, to_json
from ..utils.tagging import GENERIC_TAGS
from ..utils.tls import TlsContextWrapper
from ..utils.tracing import traced_class

if AGENT_RUNNING:
@@ -79,18 +86,7 @@
prof.start()

if TYPE_CHECKING:
import inspect as _module_inspect
import ssl # noqa: F401
import traceback as _module_traceback
import unicodedata as _module_unicodedata

from datadog_checks.base.utils.diagnose import Diagnosis
from datadog_checks.base.utils.http import RequestsWrapper
from datadog_checks.base.utils.metadata import MetadataManager

inspect: _module_inspect = lazy_loader.load('inspect')
traceback: _module_traceback = lazy_loader.load('traceback')
unicodedata: _module_unicodedata = lazy_loader.load('unicodedata')

# Metric types for which it's only useful to submit once per set of tags
ONE_PER_CONTEXT_METRIC_TYPES = [aggregator.GAUGE, aggregator.RATE, aggregator.MONOTONIC_COUNT]
@@ -345,9 +341,6 @@ def _get_metric_limiter(self, name, instance=None):
limit = self._get_metric_limit(instance=instance)

if limit > 0:
# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.limiter import Limiter

return Limiter(name, 'metrics', limit, self.warning)

return None
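
For illustration, a generic sketch of the context-limiting idea behind `Limiter` (names and interface here are illustrative stand-ins; the real class may differ):

```python
class SimpleLimiter:
    """Illustrative stand-in: reject submissions past `limit` distinct contexts."""

    def __init__(self, check_name, object_name, limit, warning_func):
        self.check_name = check_name
        self.object_name = object_name
        self.limit = limit
        self.warning = warning_func
        self.seen = set()

    def is_reached(self, context=None):
        if context in self.seen:
            return False  # contexts already counted are always allowed
        if len(self.seen) >= self.limit:
            self.warning(
                'Check %s exceeded limit of %s %s',
                self.check_name, self.limit, self.object_name,
            )
            return True
        self.seen.add(context)
        return False
```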
@@ -392,22 +385,20 @@ def load_config(yaml_str):
"""
Convenience wrapper to ease programmatic use of this class from the C API.
"""
# See Performance Optimizations in this package's README.md.
import yaml

return yaml.safe_load(yaml_str)

@property
def http(self) -> RequestsWrapper:
def http(self):
# type: () -> RequestsWrapper
"""
Provides logic to yield consistent network behavior based on user configuration.

Only new checks or checks on Agent 6.13+ can and should use this for HTTP requests.
"""
if not hasattr(self, '_http'):
# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.http import RequestsWrapper
# See Performance Optimizations in this package's README.md.
from ..utils.http import RequestsWrapper

if not hasattr(self, '_http'):
self._http = RequestsWrapper(self.instance or {}, self.init_config, self.HTTP_CONFIG_REMAPPER, self.log)

return self._http
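
As a usage note (not part of the diff), a check typically consumes this property like a `requests` session; the endpoint and metric name below are hypothetical:

```python
def check(self, _):
    # `self.http` mirrors the `requests` API while applying the user's
    # shared HTTP settings (proxies, timeouts, TLS, auth).
    response = self.http.get('http://localhost:8080/health')  # hypothetical endpoint
    response.raise_for_status()
    self.gauge('myapp.health.up', 1)
```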
@@ -443,14 +434,12 @@ def formatted_tags(self):
return self.__formatted_tags

@property
def diagnosis(self) -> Diagnosis:
def diagnosis(self):
# type: () -> Diagnosis
"""
A Diagnosis object to register explicit diagnostics and record diagnoses.
"""
if not hasattr(self, '_diagnosis'):
# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.diagnose import Diagnosis

self._diagnosis = Diagnosis(sanitize=self.sanitize)
return self._diagnosis

@@ -464,9 +453,6 @@ def get_tls_context(self, refresh=False, overrides=None):
Since: Agent 7.24
"""
if not hasattr(self, '_tls_context_wrapper'):
# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.tls import TlsContextWrapper

self._tls_context_wrapper = TlsContextWrapper(
self.instance or {}, self.TLS_CONFIG_REMAPPER, overrides=overrides
)
@@ -477,17 +463,15 @@ def get_tls_context(self, refresh=False, overrides=None):
return self._tls_context_wrapper.tls_context

@property
def metadata_manager(self) -> MetadataManager:
def metadata_manager(self):
# type: () -> MetadataManager
"""
Used for sending metadata via Go bindings.
"""
if not hasattr(self, '_metadata_manager'):
if not self.check_id and AGENT_RUNNING:
raise RuntimeError('Attribute `check_id` must be set')

# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.metadata import MetadataManager

self._metadata_manager = MetadataManager(self.name, self.check_id, self.log, self.METADATA_TRANSFORMERS)

return self._metadata_manager
@@ -516,9 +500,8 @@ def in_developer_mode(self):
return False

def log_typos_in_options(self, user_config, models_config, level):
# See Performance Optimizations in this package's README.md.
# only import it when running in python 3
from jellyfish import jaro_winkler_similarity
from pydantic import BaseModel

user_configs = user_config or {} # type: Dict[str, Any]
models_config = models_config or {}
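
The similarity test at the heart of this method can be sketched in isolation; the threshold and option names below are illustrative:

```python
from jellyfish import jaro_winkler_similarity

def likely_typos(user_options, known_options, threshold=0.85):
    # Report user-supplied options that nearly match, but do not equal,
    # a known option name.
    matches = []
    for option in user_options:
        if option in known_options:
            continue
        for known in known_options:
            if jaro_winkler_similarity(option, known) > threshold:
                matches.append((option, known))
    return matches

# likely_typos({'min_colection_interval'}, {'min_collection_interval'})
# -> [('min_colection_interval', 'min_collection_interval')]
```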
@@ -589,8 +572,6 @@ def load_configuration_model(import_path, model_name, config, context):

model = getattr(package, model_name, None)
if model is not None:
from pydantic import ValidationError

try:
config_model = model.model_validate(config, context=context)
except ValidationError as e:
@@ -619,14 +600,12 @@ def load_configuration_model(import_path, model_name, config, context):
def _get_config_model_context(self, config):
return {'logger': self.log, 'warning': self.warning, 'configured_fields': frozenset(config)}

def register_secret(self, secret: str) -> None:
def register_secret(self, secret):
# type: (str) -> None
"""
Register a secret to be scrubbed by `.sanitize()`.
"""
if not hasattr(self, '_sanitizer'):
# See Performance Optimizations in this package's README.md.
from datadog_checks.base.utils.secrets import SecretsSanitizer

# Configure lazily so that checks that don't use sanitization aren't affected.
self._sanitizer = SecretsSanitizer()
self.log.setup_sanitization(sanitize=self.sanitize)
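
A hedged usage sketch (the option name is illustrative): registering a secret up front so it is masked wherever it would otherwise leak into logs or error reports:

```python
def __init__(self, name, init_config, instances):
    super().__init__(name, init_config, instances)
    password = self.instance.get('password', '')  # hypothetical option
    if password:
        self.register_secret(password)
```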
@@ -1032,15 +1011,15 @@ def send_log(self, data, cursor=None, stream='default'):
# convert seconds to milliseconds
attributes['timestamp'] = int(timestamp * 1000)

datadog_agent.send_log(json.encode(attributes), self.check_id)
datadog_agent.send_log(to_json(attributes), self.check_id)
if cursor is not None:
self.write_persistent_cache('log_cursor_{}'.format(stream), json.encode(cursor))
self.write_persistent_cache('log_cursor_{}'.format(stream), to_json(cursor))

def get_log_cursor(self, stream='default'):
# type: (str) -> dict[str, Any] | None
"""Returns the most recent log cursor from disk."""
data = self.read_persistent_cache('log_cursor_{}'.format(stream))
return json.decode(data) if data else None
return from_json(data) if data else None
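
A sketch of cursor-based resumption using this pair of methods; the client and field names are hypothetical:

```python
def check(self, _):
    cursor = self.get_log_cursor(stream='audit') or {}
    since = cursor.get('timestamp', 0)

    for record in self.client.fetch_logs(since=since):  # hypothetical client
        self.send_log(
            {'message': record['message'], 'timestamp': record['timestamp']},
            cursor={'timestamp': record['timestamp']},
            stream='audit',
        )
```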

def _log_deprecation(self, deprecation_key, *args):
# type: (str, *str) -> None
@@ -1211,11 +1190,7 @@ def get_diagnoses(self):
The agent calls this method to retrieve diagnostics from integrations. This method
runs explicit diagnostics if available.
"""
return json.encode([d._asdict() for d in (self.diagnosis.diagnoses + self.diagnosis.run_explicit())])

def _clear_diagnosis(self) -> None:
if hasattr(self, '_diagnosis'):
self._diagnosis.clear()
return to_json([d._asdict() for d in (self.diagnosis.diagnoses + self.diagnosis.run_explicit())])

def _get_requests_proxy(self):
# type: () -> ProxySettings
@@ -1299,7 +1274,7 @@ def cancel(self):
def run(self):
# type: () -> str
try:
self._clear_diagnosis()
self.diagnosis.clear()
# Ignore check initializations if running in a separate process
if is_affirmative(self.instance.get('process_isolation', self.init_config.get('process_isolation', False))):
from ..utils.replay.execute import run_with_isolation
@@ -1335,7 +1310,7 @@ def run(self):
except Exception as e:
message = self.sanitize(str(e))
tb = self.sanitize(traceback.format_exc())
error_report = json.encode([{'message': message, 'traceback': tb}])
error_report = to_json([{'message': message, 'traceback': tb}])
finally:
if self.metric_limiter:
if is_affirmative(self.debug_metrics.get('metric_contexts', False)):
@@ -4,9 +4,6 @@

from itertools import tee

from prometheus_client.metrics_core import Metric
from prometheus_client.parser import _parse_sample, _replace_help_escaping


def text_fd_to_metric_families(fd):
raw_lines, input_lines = tee(fd, 2)
@@ -32,6 +29,10 @@ def _parse_payload(fd):

Yields Metric's.
"""
# See Performance Optimizations in this package's README.md.
from prometheus_client.metrics_core import Metric
from prometheus_client.parser import _parse_sample, _replace_help_escaping

name = ''
documentation = ''
typ = 'untyped'
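
For context, the public upstream parser that this vendored module builds on can be exercised directly; the exposition text below is a minimal example:

```python
from prometheus_client.parser import text_string_to_metric_families

EXPOSITION = (
    '# HELP http_requests_total Total HTTP requests.\n'
    '# TYPE http_requests_total counter\n'
    'http_requests_total{method="get"} 1027\n'
)

for family in text_string_to_metric_families(EXPOSITION):
    for sample in family.samples:
        print(family.name, family.type, sample.labels, sample.value)
```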