(Download the dbo.fn_dblog2 function here)
Update and word of caution: Be aware that the mentioned function is undocumented and subject to change in any release of SQL Server. You should use it only for investigational purposes, for example when you are investigating severe data issues on your server. The function parses the log and, depending on the size of the log, can have a severe impact on performance while running.

In some situations you will be able neither to use a third-party reader for the transaction log nor to rely on integrated functionality such as CDC or Change Tracking to find out what happened behind the scenes. The only resort in that case is to parse and read the log yourself. As the log structure is not officially documented, you have to use the exposed methods like fn_dblog() to get the information for the current database. This function accepts two parameters: the starting and the ending LSN of the interval you want to check. This can be useful if your log is big, and both values can be retrieved directly from the log itself. Besides that, you can use the transaction time to filter the events. To make life easier, I wrapped the execution in a function called fn_dblog2(), accepting a start time and an end time as parameters (both NULLable to get the whole log information).
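The downloadable function above is the authoritative version; the following is only a minimal sketch of the idea, to show how such a time-based wrapper around fn_dblog() can be built. Note that fn_dblog() is undocumented, so the column names used here are simply those it currently exposes, and the time filter has to go through the LOP_BEGIN_XACT records, because only those carry a [Begin Time]:

```sql
-- Sketch only: the real fn_dblog2() from the download may differ.
CREATE FUNCTION dbo.fn_dblog2
(
    @StartTime DATETIME = NULL,   -- NULL = from the beginning of the log
    @EndTime   DATETIME = NULL    -- NULL = up to the end of the log
)
RETURNS TABLE
AS
RETURN
(
    SELECT l.[Current LSN],
           l.Operation,
           l.[Transaction ID],
           l.AllocUnitName
    FROM   sys.fn_dblog(NULL, NULL) AS l      -- NULL, NULL = whole log
    -- Only LOP_BEGIN_XACT records carry [Begin Time], so the time window
    -- is applied to those, and all records of matching transactions kept.
    WHERE  l.[Transaction ID] IN
           (
               SELECT x.[Transaction ID]
               FROM   sys.fn_dblog(NULL, NULL) AS x
               WHERE  x.Operation = 'LOP_BEGIN_XACT'
                 AND (@StartTime IS NULL OR x.[Begin Time] >= @StartTime)
                 AND (@EndTime   IS NULL OR x.[Begin Time] <= @EndTime)
           )
);
```

Called as `SELECT * FROM dbo.fn_dblog2('2020-01-01 08:00', '2020-01-01 09:00');`, or with `DEFAULT, DEFAULT` to read the whole log.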
The result brings back aggregated information about the DML operations in the passed time span in the following way:
You will see INSERTs (LOP_INSERT_ROWS), UPDATEs (LOP_MODIFY_ROW) and DELETEs (LOP_DELETE_ROWS) there.
From this you can see which DML operations happened when, and how many rows were affected by them.
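The aggregation itself can be sketched along these lines (again only an illustration against the raw fn_dblog() output; the downloadable function may group differently):

```sql
-- Count DML log records per table (AllocUnitName) and operation type.
SELECT   AllocUnitName,
         Operation,
         COUNT(*) AS AffectedRows
FROM     sys.fn_dblog(NULL, NULL)
WHERE    Operation IN ('LOP_INSERT_ROWS', 'LOP_MODIFY_ROW', 'LOP_DELETE_ROWS')
GROUP BY AllocUnitName, Operation
ORDER BY AllocUnitName, Operation;
```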
If you want something more customized, or you also want to include the user name or context information, you may be better off using Change Tracking, or, if you also need the changed data itself, Change Data Capture.
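For reference, enabling those features is straightforward; the database and table names below (MyDb, dbo.MyTable) are placeholders for your own:

```sql
-- Change Tracking: database level, then per table.
ALTER DATABASE MyDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.MyTable
    ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON);

-- Change Data Capture, if you need the changed data itself:
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'MyTable',
     @role_name     = NULL;   -- NULL = no gating role for the CDC data
```

Note that CDC is not available in every edition of SQL Server, so check your edition before relying on it.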
If you are on a system where none of the options mentioned above is available, you might want to look at the automatic trigger generation script I created once, which you can read about in this blog entry.