varchar(255) vs varchar(50) performance
I would like to know whether there would be a performance improvement if we change a column from varchar(255) to varchar(50).
I have a scenario where the column only ever holds 50 characters, but I declared the datatype as varchar(255). One of my friends told me to alter the datatype to varchar(50) for performance, and it did give better performance.
I could not figure out the exact reason behind that. Please help me understand why this difference happens with varchar.
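For context, the change being discussed would look something like this (the table and column names are hypothetical, and the syntax shown is MySQL-style; SQL Server would use ALTER COLUMN instead of MODIFY COLUMN):

```sql
-- Hypothetical table: the column only ever holds up to 50 characters,
-- but was originally declared as varchar(255).
ALTER TABLE customer
    MODIFY COLUMN city VARCHAR(50);
```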
I believe that varchar stores only the bytes actually used, not the declared maximum, so I think varchar(255) vs varchar(50) should have nothing to do with performance.
I have a scenario where the column occupies only 50 characters but I have declared the datatype with varchar(255).
What's your reason for doing that? If the data is the same in each case, then I don't think there will be any performance impact. However, you could find that the data eventually grows larger than you intended simply because you've allowed it to be larger. So ultimately there might be some impact on the volume of data, and therefore on the time taken to process it.