We run automated inventories using crawling software, data-extraction tools and custom Perl scripts. Where automated tools can’t be used, e.g. on intranets, we can carry out manual inventories instead. The end result is usually a huge spreadsheet plus some analysis and recommendations.
The inventories can capture:
- URLs
- Titles and lengths
- Meta descriptions and keywords
- Page level in the site structure
- Page status (e.g. 404s) and load speed
- Headings
- Word count
- Images, videos and other assets
- Links to and from the page
- Other identifiable content and metadata, e.g. page templates
- Analytics data, e.g. pageviews, entrance rates and bounce rates
- SEO data, e.g. index checks and ranking data
- Social share counts
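As an illustration of the per-page record behind those columns, here is a minimal sketch in Python using only the standard library. It is a simplified example, not our actual tooling: the field names are illustrative, and a real crawl also handles fetching, redirects, link graphs and load timing.

```python
from html.parser import HTMLParser

VOID_TAGS = {"meta", "img", "br", "hr", "link", "input"}

class InventoryParser(HTMLParser):
    """Extracts the fields an inventory records for one page:
    title, meta description/keywords, headings, links, word count."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}          # e.g. {"description": ..., "keywords": ...}
        self.headings = []
        self.links = []
        self.word_count = 0
        self._stack = []        # currently open (non-void) tags

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and "name" in attrs:
            self.meta[attrs["name"].lower()] = attrs.get("content", "")
        elif tag == "a" and attrs.get("href"):
            self.links.append(attrs["href"])
        if tag not in VOID_TAGS:
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        parent = self._stack[-1] if self._stack else ""
        if parent == "title":
            self.title += text
        elif parent in ("h1", "h2", "h3", "h4"):
            self.headings.append(text)
        self.word_count += len(text.split())

def inventory_row(url, html):
    """One spreadsheet row of the inventory for a single page."""
    p = InventoryParser()
    p.feed(html)
    return {
        "url": url,
        "title": p.title,
        "title_length": len(p.title),
        "meta_description": p.meta.get("description", ""),
        "meta_keywords": p.meta.get("keywords", ""),
        "headings": p.headings,
        "word_count": p.word_count,
        "outbound_links": p.links,
    }

SAMPLE = """<html><head><title>About Us</title>
<meta name="description" content="Who we are">
</head><body><h1>About</h1>
<p>We are a small team.</p>
<a href="/contact">Contact</a>
</body></html>"""

row = inventory_row("https://example.com/about", SAMPLE)
```

In practice one such row is produced per URL as the crawler walks the site, and the rows become the inventory spreadsheet.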
We can then analyse:
- Size of site
- Breadth and depth
- Distribution of content
- Duplicate content and metadata
- Popularity of content
- Potential problem areas
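To give a flavour of that analysis step, the sketch below summarises a list of per-page inventory rows. Again this is illustrative Python, not our production scripts, and the field names and thresholds (e.g. treating under 50 words as "thin") are assumptions for the example.

```python
from collections import Counter
from urllib.parse import urlparse

def analyse(rows):
    """Summarise inventory rows: site size, depth distribution,
    duplicate metadata and potential problem pages."""
    def depth(url):
        # Depth = number of path segments, e.g. /blog/post-1 -> 2
        return len([s for s in urlparse(url).path.split("/") if s])

    title_counts = Counter(r["title"] for r in rows)
    return {
        "pages": len(rows),
        "depth_distribution": dict(Counter(depth(r["url"]) for r in rows)),
        "duplicate_titles": sorted(t for t, n in title_counts.items() if n > 1),
        # Problem pages: broken (4xx/5xx status) or thin (< 50 words)
        "problem_pages": [r["url"] for r in rows
                          if r.get("status", 200) >= 400
                          or r.get("word_count", 0) < 50],
    }

rows = [
    {"url": "https://example.com/", "title": "Home",
     "status": 200, "word_count": 300},
    {"url": "https://example.com/about/", "title": "About",
     "status": 200, "word_count": 20},
    {"url": "https://example.com/blog/post-1", "title": "Post",
     "status": 200, "word_count": 800},
    {"url": "https://example.com/blog/post-2", "title": "Post",
     "status": 404, "word_count": 0},
]
summary = analyse(rows)
```

Even a toy summary like this surfaces the headline numbers: how big the site is, how deep it goes, which titles are duplicated and which pages need attention.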
The inventory can form the basis for an expert audit of the site, identifying:
- Content to remove
- Underperforming areas
- Anything else appropriate to the project
