u-boot-brain/tools/binman/etype/blob.py
commit 8287ee852d ("binman: Move compression into the Entry base class")
Author: Simon Glass <sjg@chromium.org>
Compression is currently available only with blobs. However, we want to
report the compression algorithm and uncompressed size for all entries,
so that other entry types can support compression. This will help with
the forthcoming 'list' feature, which lists the entries in the image.

Move the compression properties into the base class. Also fix up the docs
which had the wrong property name.

Signed-off-by: Simon Glass <sjg@chromium.org>
2019-07-24 12:54:08 -07:00
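
For illustration only, here is a rough sketch of the pattern this change describes, written with simplified, hypothetical names rather than binman's real classes: the base class owns the 'compress' property and the uncompressed size, so any derived entry type gets compression support and can report both values.

import lzma

class EntryBase:
    """Simplified stand-in for binman's Entry base class (hypothetical)."""
    def __init__(self, props):
        # 'props' is a plain dict standing in for the device-tree node;
        # binman reads the 'compress' property via fdt_util.GetString().
        self.compress = props.get('compress', 'none')
        self.uncomp_size = None

    def CompressData(self, indata):
        """Compress 'indata' if requested, recording the uncompressed size."""
        if self.compress == 'none':
            return indata
        self.uncomp_size = len(indata)
        # binman shells out to the 'lz4' tool; lzma is only a stand-in here.
        return lzma.compress(indata)

class EntryFromFile(EntryBase):
    """Hypothetical derived type: it inherits the compression handling."""
    def __init__(self, props, pathname):
        EntryBase.__init__(self, props)
        self._pathname = pathname

    def ObtainContents(self):
        with open(self._pathname, 'rb') as fd:
            self.data = self.CompressData(fd.read())
        return True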


# SPDX-License-Identifier: GPL-2.0+
# Copyright (c) 2016 Google, Inc
# Written by Simon Glass <sjg@chromium.org>
#
# Entry-type module for blobs, which are binary objects read from files
#

from entry import Entry
import fdt_util
import state
import tools

class Entry_blob(Entry):
    """Entry containing an arbitrary binary blob

    Note: This should not be used by itself. It is normally used as a parent
    class by other entry types.

    Properties / Entry arguments:
        - filename: Filename of file to read into entry
        - compress: Compression algorithm to use:
            none: No compression
            lz4: Use lz4 compression (via 'lz4' command-line utility)

    This entry reads data from a file and places it in the entry. The
    default filename is often specified by the subclass. See for example
    the 'u_boot' entry which provides the filename 'u-boot.bin'.

    If compression is enabled, an extra 'uncomp-size' property is written to
    the node (if enabled with -u) which provides the uncompressed size of
    the data.
    """
    def __init__(self, section, etype, node):
        Entry.__init__(self, section, etype, node)
        self._filename = fdt_util.GetString(self._node, 'filename', self.etype)
        self.compress = fdt_util.GetString(self._node, 'compress', 'none')

    def ObtainContents(self):
        self._filename = self.GetDefaultFilename()
        self._pathname = tools.GetInputFilename(self._filename)
        self.ReadBlobContents()
        return True

    def ReadBlobContents(self):
        # We assume the data is small enough to fit into memory. If this
        # is used for a large filesystem image that might not be true.
        # In that case, Image.BuildImage() could be adjusted to use a
        # new Entry method which can read in chunks. Then we could copy
        # the data in chunks and avoid reading it all at once. For now
        # this seems like an unnecessary complication.
        indata = tools.ReadFile(self._pathname)
        if self.compress != 'none':
            self.uncomp_size = len(indata)
        data = tools.Compress(indata, self.compress)
        self.SetContents(data)
        return True

    def GetDefaultFilename(self):
        return self._filename
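
As the docstring says, Entry_blob is normally used as a parent class. A minimal sketch of such a subclass, in the spirit of the 'u_boot' entry type mentioned above (the real etype/u_boot.py may differ in detail), only needs to supply a default filename:

# In the binman tree an entry type like this lives in its own etype module
# and imports the parent with 'from blob import Entry_blob'.
class Entry_u_boot_sketch(Entry_blob):
    """Sketch of a blob subclass that reads u-boot.bin by default"""
    def __init__(self, section, etype, node):
        Entry_blob.__init__(self, section, etype, node)

    def GetDefaultFilename(self):
        return 'u-boot.bin'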