Another Round Of Link Cleaning (As Best as Possible) #910
Regarding #902
@RichardLitt I got a decent chunk of redirects updated and removed some dead links.
There are still a few links awesome_bot flags, but I am running into some weird behavior. For example, https://www.toptal.com/freelance/how-to-work-remotely-and-still-be-the-best redirects to https://www.toptal.com/developers/blog, but the bot doesn't pick that up, and the link doesn't redirect if you copy and paste the URL directly into the address bar, at least for me.
Anyway, I think this is a decent cleanup, but awesome_bot is still a bit angry. Some of the links that could be updated return a 403 when I use my own link checker, so my guess (from my limited knowledge) is that some links need a simulated browser for checking, because some URLs really don't like getting requests from "bots".
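For what it's worth, here is a rough sketch of the workaround I have in mind: retry a link check with a browser-like `User-Agent` header when the plain request gets a 403. The function names, the UA string, and the retry logic are all mine, not anything from awesome_bot:

```python
import urllib.request
import urllib.error

# Some servers return 403 to anything whose User-Agent doesn't look like a
# browser. This is a hypothetical helper, not part of awesome_bot.
BROWSER_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36"
)

def make_request(url, browser_like=False):
    """Build a GET request, optionally spoofing a browser User-Agent."""
    headers = {"User-Agent": BROWSER_UA} if browser_like else {}
    return urllib.request.Request(url, headers=headers)

def check_link(url):
    """Return the status code for url, retrying with a browser UA on 403."""
    last_code = None
    for browser_like in (False, True):
        try:
            with urllib.request.urlopen(make_request(url, browser_like),
                                        timeout=10) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            last_code = e.code
            if e.code != 403:
                return e.code  # a non-403 error won't be fixed by a UA change
    return last_code
```

Even this wouldn't catch everything (some sites fingerprint more than the UA header), but it would explain why the same URL works in a real browser and 403s for the bot.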
I also think the awesome_bot workflow should be updated to allow 403 status codes: even though the bot doesn't have authorized access to a URL, those URLs work fine when opened in a proper browser.
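If I'm reading awesome_bot's options correctly (worth double-checking against its README before merging), the workflow change could be as small as passing the allowed status codes on the command line. A sketch, with the `--allow` flag assumed rather than verified:

```shell
# Hypothetical workflow step -- the --allow flag and its comma-separated
# status-code list are assumed from awesome_bot's documented options.
awesome_bot README.md --allow-redirect --allow 403
```

That would keep genuine 404s failing the build while letting bot-hostile but working sites through.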
Lemme know if there is anything else I can do to help!
Log of Changes: