“We rely heavily on New Relic One to give us a snapshot view of the organization, how it’s running, how the infrastructure is running, when we need to scale, when we’re over-extended, and when we’re paying more than we should.”

— Tom Cooper, Chief Technical Officer and Co-Founder, Find.Jobs 

Challenge

The .jobs top-level domain registry, doing business as Find.Jobs, spans 30,000 websites serving job seekers and hiring companies. Before Find.Jobs deployed New Relic One, Google web crawlers weren’t indexing job listings beyond the first page of a site. The company had no idea this was happening—costing it hundreds of thousands of dollars in lost revenue.

At the same time, Find.Jobs started receiving unexpectedly large bills—between $20,000 and $40,000 per month—for log data. The company was forced to choose between logging everything it needed and controlling costs—an unacceptable trade-off for a data-driven business.

Solution

Using New Relic One and its log management capability, Find.Jobs instantly saw that Google bots were hitting the first page of a site and then not crawling the subsequent pages of jobs. After the company addressed the problem, the first month following the fix generated 20% more revenue than any previous month. Now the company doesn’t have to limit its logging because of unpredictable bills for log data. With New Relic One, it can log everything it needs.
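The kind of log analysis described above can be sketched in NRQL, New Relic’s query language. The `Log` event type is standard for New Relic Logs, but the attribute names below (`userAgent`, `request_uri`) are illustrative assumptions—actual names depend on how a site’s access logs are parsed:

```sql
-- Facet Googlebot traffic by requested page to see how deep the
-- crawler goes on a site. Attribute names are assumptions that
-- depend on the log parsing rules in use.
SELECT count(*)
FROM Log
WHERE userAgent LIKE '%Googlebot%'
FACET request_uri
SINCE 1 week ago
LIMIT 100
```

In a faceted result like this, a steep drop-off in hits past a site’s first page would match the crawl problem Find.Jobs uncovered.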

Use Cases

  • Cataloging and faceting log data on bot and human traffic to websites
  • Monitoring and testing ongoing health of websites
  • Optimizing cloud spend
  • Tracking the impact of changes on business metrics
  • Understanding customer needs and usage

“For us, every piece of data is important, which is why New Relic has a strategic impact on our business.”

— Tom Cooper, Chief Technical Officer and Co-Founder, Find.Jobs