Description
Is your feature request related to a problem? Please describe.
Type checking tooling doesn't do well at inferring the types of dictionaries when their values take on more than one type:
parameters = {"foo": "bar", "baz": 1} # mypy thinks this is dict[str, object]
_ = cursor.execute(query, parameters=parameters) # mypy complains about this
so the user has to annotate the type manually:
parameters: dict[str, str | int] = {"foo": "bar", "baz": 1} # mypy knows this is dict[str, str | int]
_ = cursor.execute(query, parameters=parameters) # mypy is ok with this
but the actual type for parameters in this interface is defined by an alias, Value (from cdb2.py), and it allows lots of different types. Setting types every time for a complex interface isn't ergonomic. It'd make sense to be able to do something like
from comdb2.dbapi2 import Value
parameters: dict[str, Value] = {"foo": "bar", "baz": 1}
_ = cursor.execute(query, parameters=parameters)
but Value isn't explicitly exported by any module, so mypy (rightly) complains about using it. We need to explicitly export the type in order for mypy to accept its usage. This makes the type part of the public-facing interface of this package, but I think that's desirable; I also think it's generally correct to publicly expose any types that are used to define public interfaces/functions.
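For context, if the complaint comes from mypy's implicit re-export checking (enabled under --strict via --no-implicit-reexport), it looks roughly like the comment below; this is an approximation and the exact wording varies by mypy version and configuration:
from comdb2.dbapi2 import Value  # flagged by mypy
# error: Module "comdb2.dbapi2" does not explicitly export attribute "Value"  [attr-defined]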
Describe the solution you'd like
Export the Value type so that it can be used for type checking in client code.
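A minimal sketch of what that could look like, assuming Value is defined in comdb2.cdb2 and should be re-exported from comdb2.dbapi2 (the exact module layout is an assumption on my part). mypy treats a name as explicitly exported if it is re-imported with the redundant-alias form or listed in __all__:
# comdb2/dbapi2.py (sketch only; the real module layout may differ)
from .cdb2 import Value as Value  # the "import X as X" form marks an explicit re-export for mypy
__all__ = ["Value"]  # alternatively (or additionally), add the name to the module's existing __all__
Either mechanism would let client code write from comdb2.dbapi2 import Value without tripping strict type checking.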
Describe alternatives you've considered
We could try to make it so mypy correctly infers a union value type for heterogeneous dictionaries instead of widening to the general dict[str, object], but I don't think that kind of change is tractable; it feels like a deliberate simplification on the part of the tool.
We could also just bite the bullet and write out the full type annotation every time (my current solution), but this isn't ergonomic for the developer.
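For reference, the manual workaround looks something like the snippet below; the union spelled out here is illustrative only, since the real Value alias in cdb2.py covers more types than shown:
# illustrative union; the actual Value alias allows additional types
parameters: dict[str, str | int | float | bytes | None] = {"foo": "bar", "baz": 1}
_ = cursor.execute(query, parameters=parameters)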
Additional context
N/A