We all know the benefits of having a website that doesn't have any 404 errors.
I crawl my clients' websites on a regular basis to check for 404 errors, and when I find some, I usually either clean the code by removing any trace of the broken link, or redirect the URL to a new location if I think that specific link has external links pointing to it.
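For context, the kind of 404 check I mean can be sketched in a few lines of Python. This is a minimal sketch, not a full crawler: `find_broken` and the injectable `fetch_status` parameter are hypothetical names I'm using for illustration, and a real run would pass a list of URLs pulled from a sitemap or crawl export.

```python
import urllib.request
import urllib.error

def find_broken(urls, fetch_status=None):
    """Return the subset of urls that respond with HTTP 404.

    fetch_status: optional callable url -> int status code,
    injectable so the logic can be tested without network access.
    """
    def default_fetch(url):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            # urllib raises on 4xx/5xx; the status lives on the exception
            return e.code

    fetch_status = fetch_status or default_fetch
    return [u for u in urls if fetch_status(u) == 404]
```

In practice you'd feed it the URL list from your crawler and 301-redirect or unlink whatever comes back.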
But sometimes I find myself in the following situation:
We do a redesign and also decide to optimize the whole URL structure. I put together a redirect list of all the important pages, but after the job is done and the website has its new design and new structure, I still get 404 errors in Search Console's Crawl section.
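By "redirect list" I mean server-side 301 rules mapping old URLs to new ones. A minimal sketch, assuming an Apache server with `.htaccess` enabled (the paths here are made up for illustration):

```apache
# 301-redirect old URLs to their new locations (hypothetical paths)
Redirect 301 /shop/old-category/old-product.html /products/new-product
Redirect 301 /shop/old-category/ /products/
```

Permanent (301) redirects are the usual choice after a restructure, since they tell search engines to pass signals to the new URL.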
If you ask me, it's pretty much impossible not to get some errors, especially on an old-style e-commerce website that had all kinds of filters and tens of thousands of product pages.
What should I do in this situation? Mark them as fixed in Search Console and they'll disappear?
Do such errors in Search Console impact rankings in any way?
If I mark them as fixed, are they still going to pop up later if I don't actually do anything to fix them (like adding redirects)?