Be good for goodness' sake! Get your pages to stick to the published standard as closely as possible; doing so shows both search engines and human visitors that you've taken the steps to make your site compliant.
Search engines don't view websites the way people do. Among other things, they inspect the underlying code looking for mistakes, and mistakes can mean your website is ranked lower than it could be. That's because it is imperative for Google to show its user base the best quality search results. All other things like relevance and off-page factors being equal, it comes down to website performance. Which site has errors? Which site is slower? Which site has broken links? If Googlebot finds any of these problems, your website can take a backseat in the SERPs behind your competitor!
If you want your painstakingly crafted websites to look their best, run them through a website validator. You want to clean up any errors before someone sees them. No, they won't display an error message to your website's visitors, but they show up in other ways: broken pages, shattered layouts, and other obvious mistakes.
If you are simply looking for mismatched or incorrectly nested tags, you can use a TIDY program, but that only checks the well-formedness of the HTML code. It will not pick up mistakes in the syntax or semantics of your markup. For that, validate your pages with a validator that supports the latest standard.
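To make the distinction concrete, here is a minimal sketch of the kind of check a TIDY-style tool performs: it only verifies that opening and closing tags match up. The `TagBalanceChecker` class and `check` helper below are hypothetical names, and this toy checker is far simpler than real tools like HTML Tidy; it demonstrates why well-formedness checking alone cannot catch syntax or semantic errors.

```python
from html.parser import HTMLParser

# Void elements never take a closing tag, so the checker skips them.
VOID = {"area", "base", "br", "col", "embed", "hr", "img",
        "input", "link", "meta", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Reports mismatched or unclosed tags.

    This checks well-formedness only -- it knows nothing about valid
    attributes, allowed nesting, or the HTML spec, which is exactly
    the gap a real validator fills.
    """
    def __init__(self):
        super().__init__()
        self.stack = []    # currently open tags
        self.errors = []   # human-readable problems found

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append(f"unexpected </{tag}>")

    def close(self):
        super().close()
        # Anything still open at end of input was never closed.
        for tag in self.stack:
            self.errors.append(f"unclosed <{tag}>")
        self.stack = []

def check(html):
    """Return a list of tag-balance errors for an HTML snippet."""
    checker = TagBalanceChecker()
    checker.feed(html)
    checker.close()
    return checker.errors
```

Note that `check('<p spellling="bad">x</p>')` would pass cleanly: the tags balance, so a misspelled attribute sails right through. Only a standards-aware validator would flag it.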