MLenard wrote: Hi there.
Do you have data to support the business case for eValid site analysis running after every deployment?
Dollars saved, time saved, website link fixed?
Very good question, thanks for asking, MLenard.
Everyone pretty much agrees that there is some negative cost (loss) that arises from broken or errant links on a web page, but over the years there have been no really complete quantitative assessments of how MUCH this kind of issue actually costs.
If you search on the web for "Cost of broken links" you find a lot of results, but few of them quantitative.
Qualitative effect is a different story.
There is general agreement that having "too many broken links" turns away customers, but nobody really knows how many it takes before you really lose customers.
Generally speaking bad links, if very few in number, are seen as simple errors: everyone forgives a dangling link now and then.
Guess: 1% of the links or less that you try fail in some way.
But if a site has a LOT of bad links, the tendency is to click away... and commercially this means that you have lost the customer's attention.
Guess: 5% of the links fail.
If the failure rate is over 5%, again a guess based on experience, you have a site that is "poorly maintained" and one that will annoy visitors (they will quit clicking and move to some other "more reliable" site).
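Just to make those rough rules of thumb concrete, here is a minimal Python sketch that buckets a site's broken-link rate using the guessed 1% and 5% thresholds above. The function name and labels are just illustrative, not anything built into eValid:

```python
def classify_link_health(broken: int, total: int) -> str:
    """Bucket a site's broken-link rate using the rough rule-of-thumb
    thresholds described above (guesses, not measured data)."""
    if total <= 0:
        raise ValueError("total link count must be positive")
    rate = broken / total
    if rate <= 0.01:    # ~1% or less: forgivable, occasional errors
        return "acceptable"
    if rate <= 0.05:    # up to ~5%: visitors start to click away
        return "concerning"
    return "poorly maintained"   # over ~5%: site looks neglected
```

For example, 1 bad link out of 100 comes back "acceptable", while 10 out of 100 comes back "poorly maintained". The exact cutoffs are, as noted, only guesses.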
So, sorry, no quantitative data on this.
Then there is the old story that has been going around for years: that prevention pays off at a sure 12:1 ratio, because everyone knows that "an ounce of prevention is worth a pound of cure".
-- eValid Support