 

How do you have shared log files under Windows?

I have several different processes and I would like them all to log to the same file. These processes are running on a Windows 7 system. Some are Python scripts and others are cmd batch files.

Under Unix you'd just have everybody open the file in append mode and write away. As long as each process wrote less than PIPE_BUF bytes in a single message, each write call would be guaranteed to not interleave with any other.
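
In Python, for instance, the Unix version is just something like this (a sketch; the filename and message are illustrative):

import os

# POSIX: O_APPEND positions every write at the current end of file, so
# concurrent appends from separate processes don't clobber each other as
# long as each message goes out in a single write() call.
fd = os.open("shared.log", os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o644)
os.write(fd, b"process says hello\n")
os.close(fd)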

Is there a way to make this happen under Windows? The naive Unix-like approach fails because, by default, Windows doesn't like more than one process having a file open for writing at a time.

asked Feb 18 '12 by Omnifarious




2 Answers

It is possible to have multiple batch processes safely write to a single log file. I know nothing about Python, but I imagine the concepts in this answer could be adapted to Python.

Windows allows at most one process to have a specific file open for write access at any point in time when the file is opened without write sharing, which is how cmd.exe opens a file for output redirection. This can be used to implement a file-based lock mechanism that guarantees events are serialized across multiple processes. See https://stackoverflow.com/a/9048097/1012053 and http://www.dostips.com/forum/viewtopic.php?p=12454 for some examples.
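
For the Python processes, one possible translation of the same retry idea (my sketch, untested against the batch code below; it relies on the standard library's Windows-only msvcrt.locking byte-range lock in addition to retrying the open, and names like append_log are illustrative):

import msvcrt
import os
import time

LOG = "myLog.log"

def append_log(message):
    # The open itself can fail with PermissionError while another process
    # (e.g. a cmd.exe >> redirection) holds the file with write sharing
    # denied, so retry the open as well.
    while True:
        try:
            f = open(LOG, "a")
            break
        except PermissionError:
            time.sleep(0.01)
    try:
        f.seek(0)                    # lock region starts at offset 0
        while True:
            try:
                # Non-blocking lock on the first byte serializes the
                # Python writers against each other.
                msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, 1)
                break
            except OSError:
                time.sleep(0.01)     # lock held elsewhere; retry
        try:
            f.write(message + "\n")  # append mode still writes at EOF
            f.flush()
        finally:
            f.seek(0)                # unlock the same region we locked
            msvcrt.locking(f.fileno(), msvcrt.LK_UNLCK, 1)
    finally:
        f.close()

While a Python writer holds the file open for write, a cmd.exe >> redirection cannot open it (sharing violation) and loops back to retry, so the two kinds of writers should still exclude each other at open granularity.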

Since all you are trying to do is write to a log, you can use the log file itself as the lock. The log operation is encapsulated in a subroutine that tries to open the log file in append mode. If the open fails, the routine loops back and tries again. Once the open succeeds, the routine executes whatever command was passed to it, with anything written to stdout redirected into the log; the file is then closed and the routine returns to the caller.

Here is a test batch script that creates 5 child processes that each write to the log file 20 times. The writes are safely interleaved.

@echo off
setlocal
if "%~1" neq "" goto :test

:: Initialize
set log="myLog.log"
2>nul del %log%
2>nul del "test*.marker"
set procCount=5
set testCount=10

:: Launch %procCount% processes that write to the same log
for /l %%n in (1 1 %procCount%) do start "" /b "%~f0" %%n

:wait for child processes to finish
2>nul dir /b "test*.marker" | find /c "test" | >nul findstr /x "%procCount%" || goto :wait

:: Verify log results
for /l %%n in (1 1 %procCount%) do (
  <nul set /p "=Proc %%n log count = "
  find /c "Proc %%n: " <%log%
)

:: Cleanup
del "test*.marker"
exit /b

:: ============================================================================
:: code below is the process that writes to the log file

:test
set instance=%1
for /l %%n in (1 1 %testCount%) do (
  call :log echo Proc %instance% says hello!
  call :log dir "%~f0"
)
echo done >"test%1.marker"
exit

:log command args...
2>nul (
  >>%log% (
    echo ***********************************************************
    echo Proc %instance%: %date% %time%
    %*
    (call ) %= This odd syntax guarantees the inner block ends with success  =%
            %= We only want to loop back and try again if redirection failed =%
  )
) || goto :log
exit /b

Here is the output, demonstrating that all 20 writes were successful for each process:

Proc 1 log count = 20
Proc 2 log count = 20
Proc 3 log count = 20
Proc 4 log count = 20
Proc 5 log count = 20

You can open the resulting "myLog.log" file to see how the writes have been safely interleaved, but the output is too large to post here.

It is easy to demonstrate that simultaneous writes from multiple processes can fail by modifying the :log routine so that it does not retry upon failure.

:log command args...
>>%log% (
  echo ***********************************************************
  echo Proc %instance%: %date% %time%
  %*
)
exit /b

Here are some sample results after "breaking" the :log routine:

The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
The process cannot access the file because it is being used by another process.
Proc 1 log count = 12
Proc 2 log count = 16
Proc 3 log count = 13
Proc 4 log count = 18
Proc 5 log count = 14
answered Sep 29 '22 by dbenham


You can give this Python module a try: http://pypi.python.org/pypi/ConcurrentLogHandler

It provides a drop-in replacement for the RotatingFileHandler which allows multiple processes to concurrently log to a single file without dropping or clobbering log events.
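
Based on the package description, usage should mirror logging.handlers.RotatingFileHandler; a minimal sketch (unverified; check the cloghandler import path and constructor arguments against the version you install):

import logging
import os
from cloghandler import ConcurrentRotatingFileHandler  # pip install ConcurrentLogHandler

logger = logging.getLogger("shared")
# Same constructor shape as RotatingFileHandler:
# (filename, mode, maxBytes, backupCount)
logger.addHandler(ConcurrentRotatingFileHandler("myLog.log", "a", 512 * 1024, 5))
logger.setLevel(logging.INFO)

logger.info("hello from pid %d", os.getpid())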

I haven't used it, but I found out about it while reading up on a related bug (Issue 4749) in Python.

If you implement your own code to do it instead of using that module, make sure you read up on the bug!

You can use output redirection on Windows just as you would in Bash: pipe the output of the batch files to a Python script that logs through the ConcurrentLogHandler, for example:
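
A sketch of such a script (log_sink.py is a hypothetical name):

# log_sink.py (hypothetical name): forward stdin lines to the shared log
import sys
import logging
from cloghandler import ConcurrentRotatingFileHandler

logger = logging.getLogger("sink")
logger.addHandler(ConcurrentRotatingFileHandler("myLog.log", "a", 512 * 1024, 5))
logger.setLevel(logging.INFO)

for line in sys.stdin:              # each line a batch file writes to stdout
    logger.info(line.rstrip("\n"))

Invoked as something like: mybatch.bat | python log_sink.py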

answered Sep 29 '22 by Steven T. Snyder