Mirror of git://git.yoctoproject.org/layerindex-web.git, synced 2025-07-19 12:49:01 +02:00
tasks: squash out CRs in task logs to avoid huge transfers
Reloading an existing update task page was taking an extremely long time to fetch down the task log and then pegging the client CPU such that the browser gave a warning. Digging into it, logs from a Clear Linux update task can be of the order of 500MB in size (with all of the line refreshing using CRs that happens during downloads), causing (a) the transfer to take a long time and (b) the JS code that updates the log text box to be extremely busy.

If we're loading the entire log from scratch (as we are when we refresh the page) rather than just getting an update since the last poll, we don't need any of those line refreshes - so squash them out before returning the data.

Signed-off-by: Paul Eggleton <paul.eggleton@linux.intel.com>
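As an illustration of the idea (not part of the commit), here is a minimal sketch of how the CR-squashing regular expression behaves on a chunk of log data. The helper name and sample log text are invented for the example; only the re.sub pattern is taken from the change below.

import re

def squash_crs(chunk):
    # Drop everything between a newline and the last CR that follows it,
    # keeping only the final refresh of each CR-updated progress line.
    # CR sequences before the first newline in the chunk are left alone.
    return re.sub(b'\n[^\n]+\r', b'\n', chunk)

# A download progress line refreshed several times via CR before completing:
log = b'Fetching source...\n 10%\r 55%\r100%\rdone\nNext step\n'
print(squash_crs(log))  # b'Fetching source...\ndone\nNext step\n'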
parent 4fd3e9c923
commit 29dd3afa66
@@ -6,6 +6,7 @@
 import os
 import sys
+import re
 from datetime import datetime
 from itertools import islice
 from pkg_resources import parse_version
@@ -1564,10 +1565,12 @@ def task_log_view(request, task_id):
         raise Http404
     try:
         f.seek(start)
-        # We need to escape this or else things that look like tags in the output
-        # will be interpreted as such by the browser
-        data = escape(f.read())
+        datastr = f.read()
+        origlen = len(datastr)
+        # Squash out CRs *within* the string (CRs at the start preserved)
+        datastr = re.sub(b'\n[^\n]+\r', b'\n', datastr)
+        # We need to escape this or else things that look like tags in the output
+        # will be interpreted as such by the browser
+        data = escape(datastr)
         response = HttpResponse(data)
         try: