Yes, I was questioned in an exit poll outside my polling place about 10 years ago. They asked me who I voted for, and why. It was none of their business, so I lied to them for the fun of it. Later, I learned that the statistical methods assume a certain number of people are weird like me and will lie, and compensate for it.
There are about 293 million people in the US right now. Most polls interview from 1,000 to 4,000 people. That means the best odds of your being in any one poll, with the largest of those samples, are something like 1 in 73,250. In other words, for every person polled, there are 73,249 who are not. Worst case, with a 1,000-person sample, the odds are about 1 in 293,000. If you count just households instead of people, the odds are a little better -- about 1 in 100,000 or so.
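If you want to check that arithmetic yourself, here's a quick back-of-the-envelope calculation. The population figure is the one quoted above; the household count (roughly 110 million at the time) is my own rough assumption:

    # Rough odds of any one person (or household) landing in a given poll.
    population = 293_000_000   # figure quoted above
    households = 110_000_000   # assumption: roughly 110M US households circa 2004

    for sample in (1_000, 4_000):
        print(f"sample of {sample}: 1 in {population // sample:,} (people), "
              f"1 in {households // sample:,} (households)")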
The statistical methods for figuring out how 293 million people think from a sample of 1,000 are pretty much cast in stone and have been proved to work, within the margin of error that is always released with the poll. What's much more critical, and the cause of differing results from different polls, is how a question is asked. Some polls are taken and released with the intent of swaying public opinion, so their methods may be suspect. But when it comes to internal polls, it's just dumb to try to bias the results -- why bother to take a poll when you know how it's going to come out? So, in political campaigns and corporate marketing, internal polls are closely guarded.
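For the curious, that margin of error comes out of a standard formula. A minimal sketch, assuming a simple random sample and the 95% confidence level most published polls use:

    import math

    # Worst-case margin of error at 95% confidence for a simple random
    # sample of size n: 1.96 * sqrt(p*(1-p)/n), maximized at p = 0.5.
    def margin_of_error(n, p=0.5, z=1.96):
        return z * math.sqrt(p * (1 - p) / n)

    for n in (1_000, 4_000):
        print(f"n = {n}: +/- {margin_of_error(n) * 100:.1f} points")

That prints about +/- 3.1 points for n = 1,000 and +/- 1.5 for n = 4,000. Notice the population size never appears in the formula: that's why a sample of 1,000 says as much about 293 million people as it would about 1 million.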
But, regardless of the methods, you can pretty much take to the bank any poll by a recognized organization that shows more than a 60%/40% split. That means at least 50% more people believe one way than the other. A poll would have to be pretty bad to come up with those numbers through bias or error alone.
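To see why a 60/40 result is so safe, compare the gap to the margin of error. A quick check using the same formula as above (assuming the smallest common sample of 1,000):

    # 60 is 1.5x 40 -- i.e., 50% more people on one side than the other.
    print(60 / 40)   # 1.5

    # Even with a 1,000-person sample, the 20-point gap dwarfs the
    # roughly 3-point margin of error on a 60% result.
    moe = 1.96 * (0.6 * 0.4 / 1_000) ** 0.5
    print(f"+/- {moe * 100:.1f} points")   # about +/- 3.0

A 20-point spread would take something like a six-sigma sampling fluke to produce by chance, which is why sampling error alone can't plausibly explain it.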