I searched for an answer to this but couldn't find anything, so hopefully it's not a duplicate.

Where should the CPU load be? For instance:

I am remotely running a SELECT query against a custom production database from a C# application; the query takes around three minutes to complete.

Every time the SELECT command is executed, CPU usage on the PC running my application climbs to around 50%. But surely the load should be on the database server that I am connecting to?

Why would the C# application's CPU usage rocket to 50% until the data has been retrieved for reading?
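For reference, the pattern I'm using looks roughly like this (a minimal sketch assuming `System.Data.SqlClient`; the connection string, query text, and column names are placeholders, not the real ones):

```csharp
using System;
using System.Data.SqlClient;

class QueryExample
{
    static void Main()
    {
        // Placeholder connection string and query; the real ones are redacted.
        using (var conn = new SqlConnection("Server=remote-db;Database=Production;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT Col1, Col2 FROM SomeTable WHERE SomeFilter = 1", conn))
        {
            cmd.CommandTimeout = 300; // the query runs for around three minutes
            conn.Open();

            // CPU on the client PC sits around 50% while this runs.
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Per-row processing happens here on the client.
                    var value = reader.GetValue(0);
                }
            }
        }
    }
}
```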

If any more info is needed, please don't hesitate to ask!