
Using Sync Command Facing Timeout Issue & batch_size is not Working. #126

Open

kamrankhatti opened this issue Dec 8, 2016 · 6 comments


kamrankhatti commented Dec 8, 2016

Hi -
When I run the process command, the data migration works perfectly, but when I use the sync command, batch_size 10000 does not work (oddly, batch_size works for embedded tables but not for other tables). This causes a TinyTds::Error: Adaptive Server connection timed out error on tables that have a large number of records.
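For illustration, here is a generic Ruby sketch (hypothetical, not Mongify's actual implementation) of what batched fetching means: rows are read in fixed-size chunks rather than in one huge query, which is what keeps a long-running read from tripping the TinyTds timeout:

```ruby
# Hypothetical sketch: walk a table of `total_rows` rows in chunks of
# `batch_size`, yielding the offset and size of each chunk. The real
# tool would issue a paged SELECT per chunk instead of one full-table
# read that can exceed the adapter's timeout.
def each_batch(total_rows, batch_size)
  offset = 0
  while offset < total_rows
    limit = [batch_size, total_rows - offset].min
    # e.g. SELECT ... ORDER BY id OFFSET <offset> ROWS FETCH NEXT <limit> ROWS ONLY
    yield offset, limit
    offset += limit
  end
end

batches = []
each_batch(25_000, 10_000) { |off, lim| batches << [off, lim] }
# batches == [[0, 10000], [10000, 10000], [20000, 5000]]
```

With batch_size 10000, a 25,000-row table is read as three short queries instead of one long one.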

Also, the migration process is very slow when we use sync compared to the process command. I think the same fix as in #38 should be implemented for sync as well. @hammady, could you please look into this?

Thanks
Kamran

@srizviOfficial

@kamrankhatti Thanks for identifying this issue.

@hammady @anlek

I am facing this issue too; the process and sync commands behave differently.

I was able to write a translation file that translates data from MS SQL Server to MongoDB via the process command without much trouble; there were some issues, but we were able to fix them.

However, we also have a requirement to handle differential records (deltas). For that we are using the sync command, but it has some issues that need to be fixed: there are timeout errors, and another thing I observed is that it does not migrate data in chunks; instead it processes all the records at once, which is causing problems for us.

@hammady do you recommend any workaround, or how can I deal with this issue?

Contributor

hammady commented Dec 8, 2016

If the process command was updated to work in batches after I merged the sync feature, I should be able to port this feature as well. @anlek please confirm.


srizviOfficial commented Dec 9, 2016

@hammady That would be a great help, man.

Owner

anlek commented Dec 10, 2016

Hey guys,

I'm not sure; I don't think the sync command uses the same internal code path to batch the data.
I think this is the part of the code that is not being batched.

@hammady Does that sound right to you?

Author

kamrankhatti commented Dec 14, 2016

One more thing I found: the sync command ignores all embedded tables, so each time you run it, it brings over all the data for the embedded tables.

I have noticed that it does not insert embedded table names into the __mongify_sync_helper__ table; that is why the sync command re-imports all the data for embedded tables on every run.

@hammady have you tested sync with embedded tables?
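To illustrate the behavior being described, here is a minimal Ruby sketch (hypothetical names, not Mongify's actual code) of the bookkeeping a sync helper table provides: each source table maps to its last synced marker, and a table that was never recorded there, as the embedded tables reportedly are not, falls back to a full re-import on every run:

```ruby
# Hypothetical sketch: a sync helper maps table name => last synced id.
sync_helper = { "users" => 500 }

# Rows still to sync: everything after the recorded marker. A table
# missing from the helper has no marker, so it starts from scratch.
def rows_to_sync(helper, table, max_id)
  last = helper.fetch(table, 0)
  (last + 1)..max_id
end

rows_to_sync(sync_helper, "users", 520).to_a.size    # => 20 (delta only)
rows_to_sync(sync_helper, "comments", 520).to_a.size # => 520 (full re-import)
```

If embedded table names are never written to the helper, they behave like the `"comments"` case above on every sync run.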

Contributor

hammady commented Dec 14, 2016 via email
