Description
Hello, I'm currently a beginner with tokio_postgres, so I may have missed something, but the documentation on what to do with the `FromSql` trait (as opposed to implementing it, which has a derive macro and is fine) is a little unclear.
I am reading a very large amount of data from a Postgres database. There are a smallish number of columns, some of which are json blobs containing a large number of fields I don't need. If I were receiving this as a POST body in a webserver, I would just use serde, which would ignore those fields as I deserialize, keeping only the ones I explicitly asked for; that would be great from a performance perspective.
The analogue here seems to be the `FromSql` trait, but I'm having trouble using it. Instead, the `query_raw` function (which seems to be the only way to stream results in) returns `Row` objects, which have already split the incoming data into `Jsonb` objects that I then have to manually re-parse, clone from, and pay the drop fee for.
This just doesn't feel right: the `FromSql` trait is already "out there", and it seems like I should be able to pass it in as a generic parameter, but I'm at a loss as to where to actually use it. I was not able to find any explicit mention of the trait in the documentation, except for how to implement it (which is straightforward), and one note on one function indicating it could not be used there (implying that it could be used elsewhere, and this was the exception).
Am I missing something? Is there a missing feature, or am I holding it wrong?
For the record, this is how I'm reading data:
```rust
use futures_util::{pin_mut, StreamExt};
use tokio_postgres::RowStream;

let params: Vec<String> = vec![];
let results_stream: RowStream = pg_conn
    .query_raw(READ_MCN_REPORTS, params.as_slice())
    .await?;
pin_mut!(results_stream);

let mut results: Vec<MyType> = Vec::new();
while let Some(result) = results_stream.next().await {
    let row = result?;
    let report: MyType = manually_convert(row);
    results.push(report);
    // etc.
}
```
Appreciate any assistance. Thank you!