This is an attempt to illustrate an alternative method of validating device-tree schema, based on pylibfdt and Python-based schema files. Two tools are included: validate_dts.py - validates a .dts file based on schema files it finds in the kernel binding_to_py.py - creates Python schema files from an individual .txt binding file Both tools are just at the proof-of-concept stage. See the instructions at the top of each for usage. This DT validator came out of work in Chrome OS, where it proved successful in providing a simple validator which is extremely flexible (e.g. custom code is easily added to check phandle targets and other non-trivial rules) and easy to use. It provides simple error messages when it finds a problem, and works directly from the compiled .dtb file. It is also quite fast, even in the face of large schemas and large DT files. Schema is handled by Python files in Documentation/devicetree/bindings, mirroring the existing .txt files. The binding_to_py.py tool makes an attempt to convert .txt to .py, but it is very basic at present.
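To give a flavour of the approach, here is a minimal, self-contained sketch of the idea. Note these are simplified stand-ins invented for illustration, not the actual kschema API: each property is described by a small Python object, and arbitrary checking code can be attached to it.

```python
# Illustrative sketch only: simplified stand-ins for the real kschema
# classes. The point is that a property's validation rule is just Python
# code attached to its descriptor.

class PropDesc:
    """Describes one property: name, whether required, optional check func"""
    def __init__(self, name, required=False, check=None):
        self.name = name
        self.required = required
        self._check = check

    def Validate(self, props):
        """Check this property against a dict of a node's properties

        Returns:
            List of error strings (empty if the property is valid)
        """
        if self.required and self.name not in props:
            return ["Required property '%s' missing" % self.name]
        if self.name in props and self._check:
            err = self._check(props[self.name])
            if err:
                return ['%s: %s' % (self.name, err)]
        return []


def check_cells(value):
    """Custom rule: only 1 or 2 address cells make sense here"""
    return None if value in (1, 2) else 'must be 1 or 2, not %s' % value


# A schema fragment in the same spirit as the generated NodeDesc/Prop lists
schema = [
    PropDesc('device_type', required=True),
    PropDesc('reg', required=True),
    PropDesc('#address-cells', check=check_cells),
]


def validate(props):
    """Run every descriptor in the schema against a node's properties"""
    errors = []
    for desc in schema:
        errors += desc.Validate(props)
    return errors
```

So validate({'reg': 1}) reports the missing 'device_type' in much the same style as the real validator's messages shown in the README below, and because the rules are plain Python, a check function could equally follow a phandle and inspect the target node.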
To try this you also need the kernel patch: RFC: Example schema files written in Python (or check README.kernel_validate) Signed-off-by: Simon Glass <sjg@xxxxxxxxxxxx> --- .gitignore | 18 -- README.kernel_validate | 118 ++++++++ binding_to_py.py | 591 +++++++++++++++++++++++++++++++++++++ fdt.py | 432 +++++++++++++++++++++++++++ fdt_util.py | 151 ++++++++++ kschema.py | 535 ++++++++++++++++++++++++++++++++++ validate_dts.py | 643 +++++++++++++++++++++++++++++++++++++++++ 7 files changed, 2470 insertions(+), 18 deletions(-) delete mode 100644 .gitignore create mode 100644 README.kernel_validate create mode 100644 binding_to_py.py create mode 100644 fdt.py create mode 100644 fdt_util.py create mode 100644 kschema.py create mode 100755 validate_dts.py diff --git a/.gitignore b/.gitignore deleted file mode 100644 index ee4e489..0000000 --- a/.gitignore +++ /dev/null @@ -1,18 +0,0 @@ -*.o -*.d -*.a -*.patch -*.so -*~ -*.tab.[ch] -lex.yy.c -*.lex.c -/dtc -/fdtdump -/convert-dtsv0 -/version_gen.h -/fdtget -/fdtput -/fdtoverlay -/patches -/.pc diff --git a/README.kernel_validate b/README.kernel_validate new file mode 100644 index 0000000..9c0a871 --- /dev/null +++ b/README.kernel_validate @@ -0,0 +1,118 @@ +Kernel Device-Tree Validator based on Python and pylibfdt +========================================================= + +Background +---------- + +As part of the discussions at Linux Plumbers 2018 [1] I showed a few people a +device-tree validator which uses a Python-based schema. This was developed for +Chrome OS to deal with the validation problem there. + +From my somewhat limited understanding of YAML, JSON and validation in that +world [2] it seems to me that it would be very difficult to get that technology +to validate the kernel DT files successfully: lots of complex tooling, regexes +and potentially a need to move to yaml for the source format.
In addition it was not +clear to me that it would be possible to do the sort of deep validation that is +desirable with the kernel DT files, for example checking that subnodes do not +conflict, or handling phandles which link nodes in both directions. Some of this +has in fact been handled by modifying dtc, which seems like a better approach. +But it has its limits, without deep knowledge of the schema. + +So I put together this proof-of-concept to show another approach, to seek +comments and see what people think. + +Bad things: +- It's Python +- It requires a new input format for the schema +- There would be a lot of work needed to turn this into production code +- Many others that I suspect you are about to point out + +Good things: +- It is quite easy to write the schema in Python +- The validation rules are easy to write and understand, even complex ones +- Highly complex validation rules (e.g. covering multiple linked nodes) are + possible to write, offering a very powerful validation framework +- It is fast and works on the .dtb file directly +- It gives simple error messages when it finds a problem +- It might be possible to automatically convert .txt files to .py, at least as + a starting point (see binding_to_py.py below) + + +How to try it +------------- + +1. Build pylibfdt + + clone https://github.com/sglass68/dtc.git + and checkout branch 'kernel-validator' + + cd dtc + sudo apt-get install swig python3-dev + PYTHON=python2 make + +You can leave out the environment variable if you like, but the Makefile uses +a strange name for the _libfdt.so file in some cases with Python 3, so you have +to know what you are doing. + + +2. Obtain the kernel patch + + clone https://github.com/sglass68/linux.git + and checkout branch 'schema' + + +3.
Run the validator + + KERNEL=/path/to/kernel + PYTHONPATH=pylibfdt python validate_dts.py -k \ + $KERNEL/arch/arm/boot/dts/zynq-zybo.dts -d + +You should see a lot of messages like: + + No schema for: xlnx,zynq-can-1.0 + +This is because only a small subset of the kernel schema has been converted to +Python so far. + + +4. Try changing something + +Open zynq-zybo.dts and change 'device_type' to 'device-type'. Then run the +validator again and you should see: + +arch/arm/boot/dts/zynq-zybo.dts: +/memory@0: Unexpected property 'device-type', valid list is (device_type, reg, #address-cells, #size-cells, interrupt-parent) +/memory@0: Required property 'device_type' missing + + +5. Try auto-generating the schema + +There is a very simple tool which attempts to convert from the kernel .txt +format for DT bindings, to Python. + + PYTHONPATH=pylibfdt python binding_to_py.py \ + Documentation/devicetree/bindings/iio/adc/xilinx-xadc.txt + +This should produce a file in the same directory as the binding file, called +xilinx-xadc.py + +The structure of the file is a hierarchical set of nodes and properties for a +particular set of compatible strings. + + +6. Take a look at the code + +validate_dts.py is the validator +binding_to_py.py is the .txt parser and .py emitter + +Both are pretty rough. I'm not looking for a code review :-) + + +[1] https://linuxplumbersconf.org/event/2/contributions/166/ +[2] https://elinux.org/Device_tree_future#Devicetree_Verification + + + +Simon Glass +sjg@xxxxxxxxxxxx +Easter 2019 diff --git a/binding_to_py.py b/binding_to_py.py new file mode 100644 index 0000000..23d9cfe --- /dev/null +++ b/binding_to_py.py @@ -0,0 +1,591 @@ +# Copyright 2019 Google LLC +# Written by Simon Glass <sjg@xxxxxxxxxxxx> + +# This program is free software; you can redistribute it and/or +# modify it under the terms of the GNU General Public License as +# published by the Free Software Foundation; either version 2 of the +# License, or (at your option) any later version.
+# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA + +"""Convert textual binding file to Python (poorly). + +This provides a way to create a Python schema file from an existing textual +binding file, i.e. a .txt file inside Documentation/devicetree/bindings. This +is not a fully comprehensive conversion by any means, just a proof of concept. + +THERE ARE NO TESTS, few comments and the code is pretty rough. Read at own risk. +This is written to show that it is possible, not as an example of how to solve +this problem. Experiments with a few markdown parsers were unsuccessful, so I +ended up with this :-) + +Run with -d to get a full exception trace. 
+""" + +from __future__ import print_function + +import argparse +from collections import namedtuple, OrderedDict +import os +import sys + +# States that we can be in +(S_NAME, # Name of binding + S_DESC, # Description of binding + S_END, # End of file + S_PROP, # Property (required or optional) + S_OPTIONS, # Property options + S_OPTION, # A single option of many + S_EXAMPLE, # Example(s) of how to use the binding + S_NODES, # List of sub-nodes + S_NODE, # Sub-node +) = range(9) + +# Names for each state +STATE_NAME = { + S_NAME: 'name', + S_DESC: 'desc', + S_PROP: 'prop', + S_OPTIONS: 'options', + S_OPTION: 'option', + S_EXAMPLE: 'example', + S_NODES: 'nodes', + S_NODE: 'node', +} + +# Items in the stack, so we can get back to a previous indent level +StackItem = namedtuple('StackItem', ['indent', 'state', 'node']) + +# Maps property names to classes +PROP_NAME_TO_CLASS = { + 'reg': 'PropReg', + 'compatible': False, + 'interrupts': 'PropInterrupts', + 'clocks': 'PropClocks', + '#address-cells': 'PropInt', + '#size-cells': 'PropInt', +} + +# Indent expected for child items in the .txt file +INDENT_DELTA = 8 + + +def ParseArgv(argv): + """Parse the available arguments. + + Invalid arguments or -h cause this function to print a message and exit. + + Args: + argv: List of string arguments (excluding program name / argv[0]) + + Returns: + argparse.Namespace object containing the attributes. + """ + parser = argparse.ArgumentParser(description=__doc__) + parser.add_argument('-d', '--debug', action='store_true', + help='Run in debug mode (full exception traceback)') + parser.add_argument('-k', '--kernel', action='store_true', + help='Search kernel bindings when compiling') + parser.add_argument('bindings', type=str, nargs='+', + help='Paths to the binding files to convert') + parser.add_argument('-r', '--raise-on-error', action='store_true', + help='Causes the converter to raise on the first ' + + 'error it finds.
This is useful for debugging.') + return parser.parse_args(argv) + +def IndentStr(indent): + """Get spaces for the given indent level + + Args: + indent: Number of levels to indent + + Returns: + A string containing the spaces to indent that much + """ + return ' ' * 4 * indent + +class Property: + """Models a single property in the binding file""" + def __init__(self, name, required, desc): + self.name = name + self.required = required + self._desc = [desc] + self._options = [] + + def AddDesc(self, desc): + self._desc.append(desc) + + def AddOption(self, option): + self._options.append(option) + + def GetValue(self): + val = [] + for opt in self._options: + val.append(opt.value) + return val + + def GetDesc(self): + if self._desc and not self._desc[-1]: + self._desc = self._desc[:-1] + return self._desc + + def GetOptions(self): + return self._options + + def RemoveFinalBlankLine(self): + if self._desc and not self._desc[-1]: + self._desc = self._desc[:-1] + +class Node: + """Models a node in the binding file, containing subnodes and properties""" + def __init__(self, name, required, desc): + self.name = name + self.required = required + self._desc = [desc] if desc is not None else [] + self._props = OrderedDict() + self._prop_lines = [] + self._subnodes = OrderedDict() + + def AddProp(self, prop): + self._props[prop.name] = prop + + def AddSubnode(self, subnode): + self._subnodes[subnode.name] = subnode + + def GetProps(self): + return self._props.values() + + def GetSubnodes(self): + return self._subnodes.values() + + def GetDesc(self): + if self._desc and not self._desc[-1]: + self._desc = self._desc[:-1] + return self._desc + + def GetProp(self, name): + return self._props.get(name) + + def AddPropLine(self, line): + self._prop_lines.append(line) + + def AddDesc(self, desc): + self._desc.append(desc) + + def GetPropLines(self): + return self._prop_lines + +class Option: + """Models an option in the binding file, a possible value for a property""" + def 
__init__(self, value, desc): + self._raw_value = value + if isinstance(value, str) and value.startswith('"'): + value = value[1:-1] + else: + try: + value = int(value) + except ValueError: + value = None + self.value = value + self._desc = [desc] + + def AddDesc(self, desc): + self._desc.append(desc) + + +class BindingConverter(object): + """Converter for binding files + + Properties: + _raise_on_error: True if the validator should raise on the first error + (useful for debugging) + """ + def __init__(self, raise_on_error): + self._raise_on_error = raise_on_error + self._infd = None + self._line = None + self._stack = [] + self._state = None + self._binding_name = None + self._used_types = set() + + def PeekLine(self): + if self._line is None: + self._line = self._infd.readline() + rest = self._line.lstrip() + indent = 0 + chars = len(self._line) - len(rest) + for ch in self._line[:chars]: + if ch == '\t': + indent += 8 + else: + indent += 1 + self._indent = indent + return self._line.strip() + + def ConsumeLine(self): + self._line = None + + def GetLine(self): + line = self.PeekLine() + self.ConsumeLine() + return line + + def GetPara(self): + para = [] + while True: + line = self.GetLine() + if not line and para: + break + para.append(line) + return '\n'.join(para) + + def GetOption(self): + opt = [] + line = self.GetLine() + if line[0:2] != '* ': + self.Raise("Expected '* ' at start of option line '%s'" % line) + opt.append(line) + indent = self._indent + while True: + line = self.PeekLine() + if not line or self._indent <= indent: + break + opt.append(line) + self.ConsumeLine() + return '\n'.join(opt) + + def GetListItem(self, line, line_type): + if line[0:2] != '- ': + self.Raise("Expected '- ' at start of '%s' line '%s'" % + (line_type, line)) + pos = line.find(':') + if pos == -1: + self.Raise("Expected ':' at end %s name '%s'" % (line_type, line)) + return line[2:pos], line[pos + 2:] + + def Raise(self, msg): + print('State %d/%s: Error: %s' % (self._state, 
STATE_NAME[self._state], + msg), file=sys.stderr) + sys.exit(1) + + def PushState(self, indent, node): + self._stack.append(StackItem(indent, self._state, node)) + + def PopState(self, line): + if not self._stack: + self.Raise("Stack underflow at '%s'" % line) + item = self._stack.pop() + return item.indent, item.state, item.node + + def Process(self, infd, node): + """Process the input file and record the information in 'node' + + This function is basically an ad-hoc state machine which attempts to + deal with the binding file, which does not seem to be in a particularly + regular format. + + Args: + infd: Input file (a .txt file from Documentation/devicetree/bindings) + node: Node object to put information into + + Raises: + ValueError if something goes wrong + """ + self._infd = infd + + self._state = S_NAME + name = '' + base_indent = 0 + required = False # Property is mandatory (else optional) + range_start = None # Not in a range of options + pending_subnode = False + pending_blank_line = False # True if a blank line awaits in a description + for linenum, line in enumerate(infd.read().splitlines()): + #print('State %d/%s: base_indent=%d, %s' % + #(self._state, STATE_NAME[self._state], base_indent, line)) + + rest = line.lstrip() + indent = 0 + chars = len(line) - len(rest) + for ch in line[:chars]: + if ch == '\t': + indent += 8 + else: + indent += 1 + + if line: + while indent < base_indent: + #print('pop indent=%d, base=%d' % (indent, base_indent)) + base_indent, self._state, node = self.PopState(line) + #print('State %d/%s: %s' % (self._state, + #STATE_NAME[self._state], line)) + #print('indent=%d, new base=%d' % (indent, base_indent)) + + tag = None + + if self._state in (S_DESC, S_NODE): + if indent >= base_indent and line and line[-1] == ':': + tag = rest[:-1] + + if tag: + self.PushState(base_indent, node) + if tag == 'Required properties': + required = True + self._state = S_PROP + elif tag == 'Optional properties': + required = False + self._state = S_PROP + elif tag in ['Example', 'Examples']: + self._state = S_EXAMPLE +
elif tag == 'Required subnodes': + required = True + self._state = S_NODES + else: + #self.Raise("Unknown tag in '%s'" % line) + #self._state = S_PROP + tag = None + if tag: + base_indent = (indent + INDENT_DELTA) & 0xf8 + continue + elif 'child node' in rest: + pending_subnode = True + self._state = S_DESC + subnode = Node(None, required, rest) + node.AddSubnode(subnode) + node = subnode + continue + + if self._state == S_NAME: + if indent: + print("Warning: expected name to be unindented " + "(indent=%d, line='%s')" % (indent, line), + file=sys.stderr) + self._binding_name = line + self._state = S_DESC + elif self._state == S_DESC: + if line: + node.AddDesc(rest) + elif self._state == S_PROP: + prop_name, desc = self.GetListItem(rest, 'prop') + self.PushState(base_indent, node) + self._state = S_OPTIONS + range_start = None + prop = Property(prop_name, required, desc) + node.AddProp(prop) + base_indent += 2 + elif self._state == S_OPTIONS: + if rest[0:2] == '* ': + pos = rest.find(':') + if pos == -1: + self.Raise("Expected ':' at end of option name '%s'" % + line) + opt_value = rest[2:pos] + if range_start is not None: + try: + range_end = int(opt_value) + except ValueError: + self.Raise("Expected int value (not '%s') for range" % + opt_value) + for val in range(range_start + 1, range_end - 1): + opt = Option(val, '') + prop.AddOption(opt) + + self.PushState(base_indent, node) + self._state = S_OPTION + opt = Option(opt_value, rest[pos:]) + prop.AddOption(opt) + base_indent = indent + 2 + + # Handle a range of integer values + elif rest == '...': + try: + range_start = int(opt_value) + except ValueError: + self.Raise("Expected int value (not '%s') for range" % + opt_value) + else: + if pending_blank_line: + prop.AddDesc('') + prop.AddDesc(rest) + pending_blank_line = False + elif self._state == S_OPTION: + if rest: + opt.AddDesc(rest) + else: + pending_blank_line = True + elif self._state == S_NODES: + node_name, desc = self.GetListItem(rest, 'node') + self.PushState(base_indent, node) + self._state = S_NODE +
subnode = Node(node_name, required, desc) + node.AddSubnode(subnode) + node = subnode + base_indent += 2 + elif self._state == S_NODE: + node.AddDesc(rest) + + def GenerateNodeOutput(self, node, indent): + """Generate the output for a node, its properties and subnodes + + Output is added by calling node.AddPropLine() for each line. + + Args: + node: Node object to process + indent: Starting indent level (0 for none, 1 for one level, etc.) + """ + for prop in node.GetProps(): + need_prop_name_str = False + pattern_str = '' + class_name = PROP_NAME_TO_CLASS.get(prop.name) + if class_name is False: + continue + #not class_name and + if prop.GetOptions(): + value_list = [] + single_type = None # See if all values are the same type + for opt in prop.GetOptions(): + value_list.append(str(opt.value)) + this_type = type(opt.value) + if single_type is None: + single_type = this_type + elif single_type and single_type != this_type: + single_type = False + if single_type == str: + class_name = 'PropStringList' + pattern_str = "str_pattern='%s'," % '|'.join(value_list) + elif single_type == int: + class_name = 'PropIntList' + pattern_str = "valid_list='%s'," % '|'.join(value_list) + if class_name: + need_prop_name_str = True + elif not class_name: + class_name = 'PropBool' + need_prop_name_str = True + if class_name == 'PropInt': + need_prop_name_str = True + if not class_name: + class_name = 'PropDesc' + need_prop_name_str = True + prop_name_str = "'%s', " % prop.name if need_prop_name_str else '' + self._used_types.add(class_name) + req_str = 'required=True, ' if prop.required else '' + prop.RemoveFinalBlankLine() + node.AddPropLine("%s%s(%s%s%s" % + (IndentStr(indent), class_name, prop_name_str, + req_str, pattern_str)) + desc_lines = prop.GetDesc() + if desc_lines: + for num, line in enumerate(desc_lines): + desc = not num and 'desc=' or '' + desc = "%s%s'%s'" % (IndentStr(indent + 1), desc, line) + if num == len(desc_lines) - 1: + desc += '),' + node.AddPropLine(desc) + 
else: + node.AddPropLine("%sdesc='')," % (IndentStr(indent + 1))) + for subnode in node.GetSubnodes(): + self.GenerateNodeOutput(subnode, indent) + + def OutputNode(self, outfd, node, indent): + """Output a node to the output file + + Args: + outfd: Output file + node: Node object to output + indent: Starting indent level (0 for none, 1 for one level, etc.) + """ + compat = node.GetProp('compatible') + if compat: + compat_list = "', '".join(compat.GetValue()) + compat_str = "['%s']" % compat_list + else: + compat_str = 'None' + desc = '' + if node.GetDesc(): + desc = ("'\n%s'" % IndentStr(indent + 2)).join(node.GetDesc()) + desc = ", desc=\n%s'%s'" % (IndentStr(indent + 2), desc) + print("%sNodeDesc('%s', %s, False%s, elements=[" % + (IndentStr(indent), node.name, compat_str, desc), file=outfd) + for prop_line in node.GetPropLines(): + print('%s%s' % (IndentStr(indent + 1), prop_line), file=outfd) + for subnode in node.GetSubnodes(): + self.OutputNode(outfd, subnode, indent + 1) + print('%s]),' % IndentStr(indent + 1), file=outfd) + + def Output(self, outfd, node): + """Output a Python binding to the output file + + This generates a (hopefully valid) Python binding file based on the + .txt binding file that was read. + + Args: + outfd: Output file + node: Node object to output + """ + print('# SPDX-License-Identifier: GPL-2.0+', file=outfd) + print('#', file=outfd) + print(file=outfd) + print('# %s' % self._binding_name, file=outfd) + print(file=outfd) + if self._used_types: + print('from kschema import %s' % + (', '.join(sorted(self._used_types))), file=outfd) + print(file=outfd) + print('schema = [', file=outfd) + self.OutputNode(outfd, node, 1) + print('%s]' % IndentStr(1), file=outfd) + + def Convert(self, fname): + """Convert a .txt binding file into a .py schema file + + The output is written to the same directory as the input file, but + with a .py extension.
+ + Args: + fname: Full path of file to convert + """ + basename = os.path.split(fname)[1] + root = os.path.splitext(fname)[0] + + outfname = root + '.py' + leafname = os.path.splitext(basename)[0] + self._used_types = set(['NodeDesc']) + with open(fname) as infd: + with open(outfname, 'w') as outfd: + node = Node(leafname, False, None) + self.Process(infd, node) + self.GenerateNodeOutput(node, 0) + self.Output(outfd, node) + +def Main(argv=None): + """Main program + + This contains the main logic of this program. + + Args: + argv: Arguments to the program (excluding argv[0]); if None, uses + sys.argv + """ + if argv is None: + argv = sys.argv[1:] + args = ParseArgv(argv) + converter = BindingConverter(args.raise_on_error) + found_errors = False + try: + for fname in args.bindings: + converter.Convert(fname) + except Exception as e: + if args.debug: + raise + print('Failed: %s' % e, file=sys.stderr) + found_errors = True + if found_errors: + sys.exit(1) + +if __name__ == '__main__': + Main() diff --git a/fdt.py b/fdt.py new file mode 100644 index 0000000..ec2b5be --- /dev/null +++ b/fdt.py @@ -0,0 +1,432 @@ +# Copyright 2019 Google LLC +# Written by Simon Glass <sjg@xxxxxxxxxxxx> +# +# Taken from U-Boot v2017.07 (tools/dtoc) +# +# This program is free software; you can redistribute it and/or +# modify it under the terms of the GNU General Public License as +# published by the Free Software Foundation; either version 2 of the +# License, or (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. 
+# +# You should have received a copy of the GNU General Public License +# along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA + +"""The higher level FDT library for parsing and interfacing with a dtb. + +Note: This pre-dates pylibfdt support in libfdt itself - crbug.com/703748 +This implementation was originally designed to work with fdtget to read data +from the DT. Now that it is possible to read it as binary data, and now that +pylibfdt has classes for Node and Property, it should be possible to simplify +this implementation significantly. However, it would be even better to upstream +any useful parts of this file into pylibfdt, and use that directly. +""" + +from __future__ import print_function + +from collections import OrderedDict +import struct +import sys + +import libfdt + +import fdt_util + +# This deals with a device tree, presenting it as an assortment of Node and +# Prop objects, representing nodes and properties, respectively. This file +# contains the base classes and defines the high-level API. You can use +# FdtScan() as a convenience function to create and scan an Fdt. + +# This implementation uses a libfdt Python library to access the device tree, +# so it is fairly efficient. + +# A list of types we support +(TYPE_BYTE, TYPE_INT, TYPE_STRING, TYPE_BOOL, TYPE_INT64) = range(5) + + +def CheckErr(errnum, msg): + """Checks for a libfdt error and raises a ValueError if one occurred.
+ + Args: + errnum: The error number returned by lib fdt + msg: The message to bundle with the error print + """ + if errnum: + raise ValueError('Error %d: %s: %s' % + (errnum, libfdt.fdt_strerror(errnum), msg)) + + +class Prop(object): + """A device tree property + + Properties: + fdt: Device tree object + name: Property name (as per the device tree) + value: Property value as a string of bytes, or a list of strings of + bytes + type: Value type + data: The string data + """ + def __init__(self, fdt, node, offset, name, data): + self.fdt = fdt + self.node = node + self._offset = offset + self.name = name + self.value = None + self.data = str(data) + if not data: + self.type = TYPE_BOOL + self.value = True + return + self.type, self.value = self.BytesToValue(data) + + def GetPhandle(self): + """Get a (single) phandle value from a property + + Gets the phandle value from a property and returns it as an integer + """ + return fdt_util.fdt32_to_cpu(self.value[:4]) + + def LookupPhandle(self): + """Look up a node by its phandle (treating this property as a phandle) + + Returns: + Node object, or None if not found + """ + return self.fdt.LookupPhandle(self.GetPhandle()) + + def BytesToValue(self, data): + """Converts a string of bytes into a type and value + + Args: + data: A string containing bytes + + Returns: + A tuple: + Type of data + Data, either a single element or a list of elements. 
Each + element is one of: + TYPE_STRING: string value from the property + TYPE_INT: a byte-swapped integer stored as a 4-byte string + TYPE_BYTE: a byte stored as a single-byte string + """ + if sys.version_info > (3, 0): + data = str(data, encoding='latin1') + else: + data = str(data) + size = len(data) + strings = data.split('\0') + is_string = True + count = len(strings) - 1 + if count > 0 and not strings[-1]: + for string in strings[:-1]: + if not string: + is_string = False + break + for ch in string: + if ch < ' ' or ch > '~': + is_string = False + break + else: + is_string = False + if is_string: + if count == 1: + return TYPE_STRING, strings[0] + else: + return TYPE_STRING, strings[:-1] + if size % 4: + if size == 1: + return TYPE_BYTE, data[0] + else: + return TYPE_BYTE, list(data) + val = [] + for i in range(0, size, 4): + val.append(data[i:i + 4]) + if size == 4: + return TYPE_INT, val[0] + else: + return TYPE_INT, val + + def GetEmpty(self, value_type): + """Get an empty / zero value of the given type + + Returns: + A single value of the given type + """ + if value_type == TYPE_BYTE: + return chr(0) + elif value_type == TYPE_INT: + return struct.pack('<I', 0) + elif value_type == TYPE_STRING: + return '' + else: + return True + + +class Node(object): + """A device tree node + + Properties: + offset: Integer offset in the device tree + name: Device tree node tname + path: Full path to node, along with the node name itself + fdt: Device tree object + subnodes: A list of subnodes for this node, each a Node object + props: A dict of properties for this node, each a Prop object. 
+ Keyed by property name + """ + def __init__(self, fdt, parent, offset, name, path): + self.fdt = fdt + self.parent = parent + self.offset = offset + self.name = name + self.path = path + self.subnodes = OrderedDict() + self.props = OrderedDict() + + def FindNode(self, name): + """Find a node given its name + + Args: + name: Node name to look for + + Returns: + Node object if found, else None + """ + for subnode in self.subnodes.values(): + if subnode.name == name: + return subnode + return None + + def Offset(self): + """Returns the offset of a node, after checking the cache + + This should be used instead of self.offset directly, to ensure that + the cache does not contain invalid offsets. + """ + self.fdt.CheckCache() + return self.offset + + def Scan(self): + """Scan a node's properties and subnodes + + This fills in the props and subnodes properties, recursively + searching into subnodes so that the entire tree is built. + """ + self.props = self.fdt.GetProps(self) + phandle = self.props.get('phandle') + if phandle: + val = fdt_util.fdt32_to_cpu(phandle.value) + self.fdt.phandle_to_node[val] = self + + offset = libfdt.fdt_first_subnode(self.fdt.GetFdt(), self.Offset()) + while offset >= 0: + sep = '' if self.path[-1] == '/' else '/' + name = self.fdt.fdt_obj.get_name(offset) + path = self.path + sep + name + node = Node(self.fdt, self, offset, name, path) + self.subnodes[name] = node + + node.Scan() + offset = libfdt.fdt_next_subnode(self.fdt.GetFdt(), offset) + + def Refresh(self, my_offset): + """Fix up the offset for each node, recursively + + Note: This does not take account of property offsets - these will not + be updated. 
+ """ + if self.offset != my_offset: + #print '%s: %d -> %d\n' % (self.path, self._offset, my_offset) + self.offset = my_offset + offset = libfdt.fdt_first_subnode(self.fdt.GetFdt(), self.offset) + for subnode in self.subnodes.values(): + subnode.Refresh(offset) + offset = libfdt.fdt_next_subnode(self.fdt.GetFdt(), offset) + + def DeleteProp(self, prop_name): + """Delete a property of a node + + The property is deleted and the offset cache is invalidated. + + Args: + prop_name: Name of the property to delete + + Raises: + ValueError if the property does not exist + """ + CheckErr(libfdt.fdt_delprop(self.fdt.GetFdt(), self.Offset(), prop_name), + "Node '%s': delete property: '%s'" % (self.path, prop_name)) + del self.props[prop_name] + self.fdt.Invalidate() + + +class Fdt(object): + """Provides simple access to a flat device tree blob using libfdt. + + Properties: + infile: The file to read the dtb from + _root: Root of device tree (a Node object) + """ + def __init__(self, infile): + self._root = None + self._cached_offsets = False + self.phandle_to_node = OrderedDict() + self._fdt = bytearray(infile.read()) + self.fdt_obj = libfdt.Fdt(self._fdt) + + def LookupPhandle(self, phandle): + """Look up a node by its phandle + + Args: + phandle: Phandle to look up (integer > 0) + + Returns: + Node object, or None if not found + """ + return self.phandle_to_node.get(phandle) + + def Scan(self): + """Scan a device tree, building up a tree of Node objects + + This fills in the self._root property + + Args: + root: Ignored + + TODO(sjg@xxxxxxxxxxxx): Implement the 'root' parameter + """ + self._root = self.Node(self, None, 0, '/', '/') + self._root.Scan() + + def GetRoot(self): + """Get the root Node of the device tree + + Returns: + The root Node object + """ + return self._root + + def GetNode(self, path): + """Look up a node from its path + + Args: + path: Path to look up, e.g.
'/microcode/update@0' + + Returns: + Node object, or None if not found + """ + node = self._root + for part in path.split('/')[1:]: + node = node.FindNode(part) + if not node: + return None + return node + + def Flush(self, outfile): + """Flush device tree changes to the given file + + Args: + outfile: The file to write the device tree out to + """ + outfile.write(self._fdt) + + def Pack(self): + """Pack the device tree down to its minimum size + + When nodes and properties shrink or are deleted, wasted space can + build up in the device tree binary. + """ + CheckErr(libfdt.fdt_pack(self._fdt), 'pack') + fdt_len = libfdt.fdt_totalsize(self._fdt) + del self._fdt[fdt_len:] + + def GetFdt(self): + """Get the contents of the FDT + + Returns: + The FDT contents as a bytearray + """ + return self._fdt + + + def GetProps(self, node): + """Get all properties from a node. + + Args: + node: A Node object to get the properties for. + + Returns: + A dictionary containing all the properties, indexed by property name. + The entries are Prop objects. + + Raises: + ValueError: if the node does not exist.
+ """ + props_dict = OrderedDict() + poffset = libfdt.fdt_first_property_offset(self._fdt, node.offset) + while poffset >= 0: + p = self.fdt_obj.get_property_by_offset(poffset) + if hasattr(p, 'value'): + prop = Prop(node.fdt, node, poffset, p.name, p.value) + else: + prop = Prop(node.fdt, node, poffset, p.name, p) + props_dict[prop.name] = prop + + poffset = libfdt.fdt_next_property_offset(self._fdt, poffset) + return props_dict + + def Invalidate(self): + """Mark our offset cache as invalid""" + self._cached_offsets = False + + def CheckCache(self): + """Refresh the offset cache if needed""" + if self._cached_offsets: + return + self.Refresh() + self._cached_offsets = True + + def Refresh(self): + """Refresh the offset cache""" + self._root.Refresh(0) + + def GetStructOffset(self, offset): + """Get the file offset of a given struct offset + + Args: + offset: Offset within the 'struct' region of the device tree + + Returns: + Position of @offset within the device tree binary + """ + return libfdt.fdt_off_dt_struct(self._fdt) + offset + + @classmethod + def Node(cls, fdt, parent, offset, name, path): + """Create a new node + + This is used by Fdt.Scan() to create a new node using the correct + class. 
+ + Args: + fdt: Fdt object + parent: Parent node, or None if this is the root node + offset: Offset of node + name: Node name + path: Full path to node + """ + node = Node(fdt, parent, offset, name, path) + return node + +def FdtScan(fname): + """Returns a new Fdt object from the implementation we are using""" + with open(fname, 'rb') as fd: + dtb = Fdt(fd) + dtb.Scan() + return dtb diff --git a/fdt_util.py b/fdt_util.py new file mode 100644 index 0000000..16f2778 --- /dev/null +++ b/fdt_util.py @@ -0,0 +1,151 @@ +# Copyright 2019 Google LLC +# Written by Simon Glass <sjg@xxxxxxxxxxxx> +# +# Taken from U-Boot v2017.07 (tools/dtoc) +# +# This program is free software; you can redistribute it and/or +# modify it under the terms of the GNU General Public License as +# published by the Free Software Foundation; either version 2 of the +# License, or (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. 
+# +# You should have received a copy of the GNU General Public License +# along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA + +"""Utility functions for fdt.""" + +from __future__ import print_function + +import os +import struct +import subprocess +import sys +import tempfile + + +def fdt32_to_cpu(val): + """Convert a device tree cell to an integer + + Args: + val: Value to convert (4-character string representing the cell value) + + Returns: + A native-endian integer value + """ + if sys.version_info > (3, 0): + if isinstance(val, bytes): + val = val.decode('utf-8') + val = val.encode('raw_unicode_escape') + return struct.unpack('>I', val)[0] + +def RunCommand(args): + """Run a command with arguments + + Args: + args: Command (args[0]) and arguments (args[1:]) + """ + process = subprocess.Popen(args, stdout=subprocess.PIPE, + stderr=subprocess.PIPE) + process.communicate() + +def CompileDts(dts_input, search_paths): + """Compiles a single .dts file + + This runs the file through the C preprocessor and then compiles it to .dtb + format. 
+ + Args: + dts_input: Input filename + search_paths: Paths to search for header files + + Returns: + Tuple: + Filename of resulting .dtb file + tempfile containing the .dtb file + """ + dtc_input = tempfile.NamedTemporaryFile(suffix='.dts', delete=False) + root, _ = os.path.splitext(dts_input) + args = ['-E', '-P', '-x', 'assembler-with-cpp', '-D__ASSEMBLY__'] + args += ['-Ulinux'] + for path in search_paths or []: + args.extend(['-I', path]) + args += ['-o', dtc_input.name, dts_input] + RunCommand(['cc'] + args) + + dtb_output = tempfile.NamedTemporaryFile(suffix='.dtb', delete=False) + args = ['-I', 'dts', '-o', dtb_output.name, '-O', 'dtb'] + args.append(dtc_input.name) + RunCommand(['dtc'] + args) + return dtb_output.name, dtb_output + + +def EnsureCompiled(fname, search_paths=None): + """Compile an fdt .dts source file into a .dtb binary blob if needed. + + Args: + fname: Filename (if .dts it will be compiled). If not, it will be + left alone + search_paths: Paths to search for header files + + Returns: + Tuple: + Filename of resulting .dtb file + tempfile object to unlink after the caller is finished + """ + out = None + _, ext = os.path.splitext(fname) + if ext == '.dtb': + return fname, None + else: + dts_input = fname + result = CompileDts(dts_input, search_paths) + if out: + os.unlink(out.name) + return result + + +def CompileAll(fnames): + """Compile a selection of .dtsi files + + This inserts the Chrome OS header and then includes the files one by one to + ensure that error messages quote the correct file/line number.
+ + Args: + fnames: List of .dtsi files to compile + """ + out = tempfile.NamedTemporaryFile(suffix='.dts', delete=False) + out.write('/dts-v1/;\n') + out.write('/ { chromeos { family: family { }; models: models { };') + out.write('schema { target-dirs { }; }; }; };\n') + for fname in fnames: + out.write('/include/ "%s"\n' % fname) + out.close() + dts_input = out.name + result = CompileDts(dts_input, None) + if out: + os.unlink(out.name) + return result + + +def GetCompatibleList(node): + """Gets the list of compatible strings for a node + + Args: + node: Node object to check + + Returns: + List containing each string in the node's 'compatible' property + """ + compats = node.props.get('compatible') + if compats is None: + return None + if isinstance(compats.value, list): + compats = [c for c in compats.value] + else: + compats = [compats.value] + return compats diff --git a/kschema.py b/kschema.py new file mode 100644 index 0000000..fa87402 --- /dev/null +++ b/kschema.py @@ -0,0 +1,535 @@ +# Copyright 2019 Google LLC +# Written by Simon Glass <sjg@xxxxxxxxxxxx> +# +# +# This program is free software; you can redistribute it and/or +# modify it under the terms of the GNU General Public License as +# published by the Free Software Foundation; either version 2 of the +# License, or (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA + +"""Schema elements used by the kernel + +This module provides schema elements that can be used to build up a schema for +validation of kernel device tree files.
+ +The classes here are just a starting point and most are just placeholders with +very little functionality. +""" + +from __future__ import print_function + +import re + +import fdt_util + + +def CheckPhandleTarget(val, target, target_compat): + """Check that the target of a phandle matches a pattern + + Args: + val: Validator (used for model list, etc.) + target: Target node (Node object) + target_compat: Match string. This is the compatible string that the + target must point to. + + Returns: + True if the target matches, False if not + """ + compats = fdt_util.GetCompatibleList(target) + if not compats: + return False + return target_compat in compats + + +class SchemaElement(object): + """A schema element, either a property or a subnode + + Args: + name: Name of schema element + prop_type: String describing this property type + required: True if this element is mandatory, False if optional + cond_props: Properties which control whether this element is present. + Dict: + key: name of controlling property + value: True if the property must be present, False if it must be absent + desc: Textual description of element for users + """ + def __init__(self, name, prop_type, required=False, cond_props=None, + desc=''): + self.name = name + self.prop_type = prop_type + self.required = required + self.cond_props = cond_props + self.desc = desc + self.parent = None + + def NameMatches(self, name): + return self.name == name + + def Validate(self, val, prop): + """Validate the schema element against the given property. + + This method is overridden by subclasses. It should call val.Fail() if there + is a problem during validation.
+ + Args: + val: CrosConfigValidator object + prop: Prop object of the property + """ + pass + + +class PropDesc(SchemaElement): + """A generic property schema element (base class for properties)""" + def __init__(self, name, prop_type, required=False, cond_props=None, + desc=''): + super(PropDesc, self).__init__(name, prop_type, required, cond_props, + desc) + + +class PropString(PropDesc): + """A string-property + + Args: + str_pattern: Regex to use to validate the string + """ + def __init__(self, name, required=False, str_pattern='', + cond_props=None): + super(PropString, self).__init__(name, 'string', required, + cond_props) + self.str_pattern = str_pattern + + def Validate(self, val, prop): + """Check the string with a regex""" + if not self.str_pattern: + return + pattern = '^' + self.str_pattern + '$' + m = re.match(pattern, prop.value) + if not m: + val.Fail(prop.node.path, "'%s' value '%s' does not match pattern '%s'" % + (prop.name, prop.value, pattern)) + + +class PropInt(PropDesc): + """An integer property""" + def __init__(self, name, required=False, int_range=None, + cond_props=None, desc=''): + super(PropInt, self).__init__(name, 'int', required, cond_props, desc) + self.int_range = int_range + + def Validate(self, val, prop): + """Check that the value is an int""" + try: + int_val = fdt_util.fdt32_to_cpu(prop.value) + if self.int_range is not None: + # pylint: disable=unpacking-non-sequence + min_val, max_val = self.int_range + if int_val < min_val or int_val > max_val: + val.Fail(prop.node.path, "'%s' value '%s' is out of range [%g..%g]" % + (prop.name, int_val, min_val, max_val)) + + except ValueError: + val.Fail(prop.node.path, "'%s' value '%s' is not an int" % + (prop.name, prop.value)) + + +class PropIntList(PropDesc): + """An int-list property schema element + + Note that the list may be empty in which case no validation is performed. 
+ + Args: + int_range: List: min and max value + """ + def __init__(self, name, required=False, int_range=None, valid_list=None, + cond_props=None, desc=''): + super(PropIntList, self).__init__(name, 'intlist', required, cond_props, + desc) + self.int_range = int_range + self.valid_list = valid_list + + def Validate(self, val, prop): + """Check each item of the list with a range""" + if not self.int_range: + return + for int_val in prop.value: + try: + if self.int_range is not None: + # pylint: disable=unpacking-non-sequence + min_val, max_val = self.int_range + int_val = fdt_util.fdt32_to_cpu(int_val) + if int_val < min_val or int_val > max_val: + val.Fail(prop.node.path, + "'%s' value '%s' is out of range [%g..%g]" % + (prop.name, prop.value, min_val, max_val)) + if self.valid_list and int_val not in self.valid_list: + val.Fail(prop.node.path, + "'%s' value '%s' is not in valid list %s" % + (prop.name, prop.value, self.valid_list)) + + except ValueError: + val.Fail(prop.node.path, "'%s' value '%s' is not an int" % + (prop.name, prop.value)) +class PropFloat(PropDesc): + """A floating-point property""" + def __init__(self, name, required=False, float_range=None, + cond_props=None): + super(PropFloat, self).__init__(name, 'float', required, cond_props) + self.float_range = float_range + + def Validate(self, val, prop): + """Check that the value is a float""" + try: + float_val = float(prop.value) + if self.float_range is not None: + # pylint: disable=unpacking-non-sequence + min_val, max_val = self.float_range + if float_val < min_val or float_val > max_val: + val.Fail(prop.node.path, "'%s' value '%s' is out of range [%g..%g]" % + (prop.name, prop.value, min_val, max_val)) + + except ValueError: + val.Fail(prop.node.path, "'%s' value '%s' is not a float" % + (prop.name, prop.value)) + + +class PropBool(PropDesc): + """A boolean property""" + def __init__(self, name, cond_props=None, desc=''): + super(PropBool, self).__init__(name, 'bool', False, +
cond_props=cond_props, desc=desc) + + +class PropFile(PropDesc): + """A file property + + This represents a file to be installed on the filesystem. + + Properties: + target_dir: Target directory in the filesystem for files from this + property (e.g. '/etc/cras'). This is used to set the install directory + and keep it consistent across ebuilds (which use cros_config_host) and + init scripts (which use cros_config). The actual file written will be + relative to this. + """ + def __init__(self, name, required=False, str_pattern='', + cond_props=None, target_dir=None): + super(PropFile, self).__init__(name, 'file', required, cond_props) + self.str_pattern = str_pattern + self.target_dir = target_dir + + def Validate(self, val, prop): + """Check the filename with a regex""" + if not self.str_pattern: + return + pattern = '^' + self.str_pattern + '$' + m = re.match(pattern, prop.value) + if not m: + val.Fail(prop.node.path, "'%s' value '%s' does not match pattern '%s'" % + (prop.name, prop.value, pattern)) + + +class PropStringList(PropDesc): + """A string-list property schema element + + Note that the list may be empty in which case no validation is performed. 
+ + Args: + str_pattern: Regex to use to validate the string + """ + def __init__(self, name, required=False, str_pattern='', + cond_props=None, desc=''): + super(PropStringList, self).__init__(name, 'stringlist', required, + cond_props, desc=desc) + self.str_pattern = str_pattern + + def Validate(self, val, prop): + """Check each item of the list with a regex""" + if not self.str_pattern: + return + pattern = '^' + self.str_pattern + '$' + value = prop.value if isinstance(prop.value, list) else [prop.value] + for item in value: + m = re.match(pattern, item) + if not m: + val.Fail(prop.node.path, "'%s' value '%s' does not match pattern '%s'" % + (prop.name, item, pattern)) + + +class PropPhandleTarget(PropDesc): + """A phandle-target property schema element + + A phandle target can be pointed to by another node using a phandle property. + """ + def __init__(self, required=False, cond_props=None): + super(PropPhandleTarget, self).__init__('phandle', 'phandle-target', + required, cond_props) + + +class PropPhandle(PropDesc): + """A phandle property schema element + + Phandle properties point to other nodes, and allow linking from one node to + another. + + Properties: + target_compat: String to use to validate the target of this phandle. + It is the compatible string that it must point to. See + CheckPhandleTarget for details. 
+ """ + def __init__(self, name, target_compat, required=False, cond_props=None): + super(PropPhandle, self).__init__(name, 'phandle', required, cond_props) + self.target_compat = target_compat + + def Validate(self, val, prop): + """Check that this phandle points to the correct place""" + phandle = prop.GetPhandle() + target = prop.fdt.LookupPhandle(phandle) + if not CheckPhandleTarget(val, target, self.target_compat): + val.Fail(prop.node.path, "Phandle '%s' targets node '%s' which " + "does not have compatible string '%s'" % + (prop.name, target.path, self.target_compat)) + + +class PropReg(PropDesc): + """A 'reg' property + + This holds register addresses + """ + def __init__(self, required=False, cond_props=None, desc=''): + super(PropDesc, self).__init__('reg', 'reg', required, cond_props, desc) + + +class PropClocks(PropDesc): + """A 'clocks' property + + This holds information about clocks used by this node + """ + def __init__(self, required=False, cond_props=None): + super(PropDesc, self).__init__('clocks', 'clocks', required, cond_props) + + +class PropRegEx(PropDesc): + """A property with a name matching a given pattern + """ + def __init__(self, regex, required=False, cond_props=None): + super(PropRegEx, self).__init__(regex, 'regex') + self.regex = regex + + def NameMatches(self, name): + return re.match(self.regex, name) is not None + + +class PropSupply(PropRegEx): + """A regulator supply property + + This holds information about regulators used by this node + """ + def __init__(self, root_name, required=False, cond_props=None): + super(PropSupply, self).__init__('%s[0-9]-supply' % root_name, required, + cond_props) + + +class PropGpios(PropDesc): + """A GPIO-list property + + This holds a list of GPIOs used by this node + """ + def __init__(self, root_name, count=None, required=False, cond_props=None): + super(PropGpios, self).__init__('%s-gpios' % root_name, required, + cond_props) + self.count = count + + +class PropInterrupts(PropDesc): + """An 
'interrupts' property + + This holds a list of interrupts used by this node + """ + def __init__(self, count=None, required=False, cond_props=None, + desc=''): + super(PropInterrupts, self).__init__('interrupts', 'interrupts', + required, cond_props, desc=desc) + self.count = count + + +class PropClocks(PropDesc): + """A 'clocks' property + + This holds a list of clocks used by this node + """ + def __init__(self, count=None, required=False, cond_props=None, + desc=''): + super(PropClocks, self).__init__('clocks', 'clocks', required, + cond_props, desc) + self.count = count + + +class PropCustom(PropDesc): + """A custom property with its own validator + + Properties: + validator: Function to call to validate this property + """ + def __init__(self, name, validator, required=False, cond_props=None): + super(PropCustom, self).__init__(name, 'custom', required, + cond_props) + self.validator = validator + + def Validate(self, val, prop): + """Validator for this property + + This should be a static method in CrosConfigValidator. + + Args: + val: CrosConfigValidator object + prop: Prop object of the property + """ + self.validator(val, prop) + + +class PropAny(PropDesc): + """A placeholder for any property name + + Properties: + validator: Function to call to validate this property + """ + def __init__(self, validator=None): + super(PropAny, self).__init__('ANY', 'any') + self.validator = validator + + def Validate(self, val, prop): + """Validator for this property + + This should be a static method in CrosConfigValidator.
+ + Args: + val: CrosConfigValidator object + prop: Prop object of the property + """ + if self.validator: + self.validator(val, prop) + + +class NodeDesc(SchemaElement): + """A generic node schema element (base class for nodes)""" + def __init__(self, name, compat, required=False, elements=None, + cond_props=None, desc=''): + super(NodeDesc, self).__init__(name, 'node', required, + cond_props) + self.compat = compat + self.elements = [] if elements is None else elements + if compat: + self.elements.append(PropStringList('compatible', True, + '|'.join(compat))) + self.elements.append(PropInt('#address-cells')) + self.elements.append(PropInt('#size-cells')) + self.elements.append(PropInt('interrupt-parent')) + for element in self.elements: + element.parent = self + + def GetNodes(self): + """Get a list of schema elements which are nodes + + Returns: + List of objects, each of which has NodeDesc as a base class + """ + return [n for n in self.elements if isinstance(n, NodeDesc)] + + +class NodeModel(NodeDesc): + """A model (top-level node in DT)""" + def __init__(self, name, compat, elements=None): + super(NodeModel, self).__init__('MODEL', compat, elements=elements) + self.name = name + self.elements.append(PropString('model', True, name)) + self.elements.append(NodeAliases()) + self.elements.append(NodeReservedMemory()) + self.elements.append(NodeThermalZones()) + self.elements.append(NodeMemory()) + self.elements.append(NodeChosen()) + + +class NodeAliases(NodeDesc): + """An /aliases node, containing references to other nodes""" + def __init__(self): + super(NodeAliases, self).__init__('ALIAS', None) + self.name = 'aliases' + self.elements.append(PropAny()) + + +class NodeByPath(NodeDesc): + """A node which is specified by path rather than compatible string""" + def __init__(self, path, elements): + super(NodeByPath, self).__init__('PATH-%s' % path, None, + elements=elements) + self.path = path + + +class NodeCpus(NodeByPath): + """A /cpus node, containing
information about CPUs""" + def __init__(self, elements=None): + super(NodeCpus, self).__init__('/cpus', elements) + self.name = 'cpus' + + +class NodeCpu(NodeDesc): + """A cpu node, containing information about a CPU""" + def __init__(self, compat, elements=None): + super(NodeCpu, self).__init__('CPU', compat, elements=elements) + self.name = 'cpu' + + +class NodeReservedMemory(NodeByPath): + """A /reserved-memory node, containing information about reserved memory""" + def __init__(self, elements=None): + super(NodeReservedMemory, self).__init__('/reserved-memory', elements) + self.name = 'reserved-memory' + + +class NodeThermalZones(NodeByPath): + """A /thermal-zones node, containing information about thermal zones""" + def __init__(self, elements=None): + super(NodeThermalZones, self).__init__('/thermal-zones', elements) + self.name = 'thermal-zones' + + +class NodeMemory(NodeByPath): + """A /memory node, containing information about memory areas""" + def __init__(self, elements=None): + super(NodeMemory, self).__init__('/memory', elements) + self.name = 'memory' + + +class NodeChosen(NodeByPath): + """A /chosen node, containing information about chosen options""" + def __init__(self, elements=None): + super(NodeChosen, self).__init__('/chosen', elements) + self.name = 'chosen' + + +class NodeAny(NodeDesc): + """A node schema element that matches a name pattern""" + def __init__(self, name_pattern, elements): + super(NodeAny, self).__init__('ANY', None, elements=elements) + self.name_pattern = name_pattern + + def Validate(self, val, node): + """Check the name with a regex""" + if not self.name_pattern: + return + pattern = '^' + self.name_pattern + '$' + m = re.match(pattern, node.name) + if not m: + val.Fail(node.path, "Node name '%s' does not match pattern '%s'" % + (node.name, pattern)) diff --git a/validate_dts.py b/validate_dts.py new file mode 100755 index 0000000..835e883 --- /dev/null +++ b/validate_dts.py @@ -0,0 +1,643 @@ +#!/usr/bin/env python2 +# Copyright 2019
Google LLC +# Written by Simon Glass <sjg@xxxxxxxxxxxx> +# +# This program is free software; you can redistribute it and/or +# modify it under the terms of the GNU General Public License as +# published by the Free Software Foundation; either version 2 of the +# License, or (at your option) any later version. +# +# This program is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# General Public License for more details. +# +# You should have received a copy of the GNU General Public License +# along with this program; if not, write to the Free Software +# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307, USA + +"""Validates a device-tree file + +This enforces various rules defined by the schema. Some of these +are fairly simple (the valid properties and subnodes for each node, the +allowable values for properties) and some are more complex (where phandles +are allowed to point). + +The schema is defined by Python objects containing various SchemaElement +subclasses. Each subclass defines how the device tree property is validated. +For strings this is via a regex. Phandle properties are validated by the +target they are expected to point to. + +Schema elements can be optional or required. Optional elements will not cause +a failure if the node does not include them. + +The presence or absence of a particular schema element can also be controlled +by a 'cond_props' option. This lists elements that must (or must not) +be present in the node for this element to be present. This provides some +flexibility where the schema for a node has two options, for example, where +the presence of one element conflicts with the presence of others.
+ +Usage: + The validator can be run like this (set PYTHONPATH to the directory with + libfdt.py): + + KERNEL=/path/to/kernel + PYTHONPATH=pylibfdt python validate_dts.py -k \ + $KERNEL/arch/arm/boot/dts/zynq-zybo.dts -d + + The output format for each input file is the name of the file followed by a + list of validation problems. If there are no problems, the filename is not + shown. + + Unit tests have been removed from this proof-of-concept version. + + +Theory of operation (in brief): + This first compiles the .dts source, then reads it in, then validates it + against the schema. The schema is obtained from the kernel source tree by + scanning for .py files containing schema for particular compatible strings. +""" + +from __future__ import print_function + +import argparse +import copy +import itertools +import os +import re +import sys + +# importlib was introduced in Python 2.7 but there was a report of it not +# working in 2.7.12, so we work around this: +# http://lists.denx.de/pipermail/u-boot/2016-October/269729.html +try: + import importlib + have_importlib = True +except: + have_importlib = False + +import fdt, fdt_util +from kschema import NodeAny, NodeDesc, NodeModel, NodeByPath +from kschema import PropCustom, PropDesc, PropString, PropStringList +from kschema import PropPhandleTarget, PropPhandle, CheckPhandleTarget +from kschema import PropAny, PropBool, PropFile, PropFloat, PropIntList +from kschema import SchemaElement, PropInt + +def ParseArgv(argv): + """Parse the available arguments. + + Invalid arguments or -h cause this function to print a message and exit. + + Args: + argv: List of string arguments (excluding program name / argv[0]) + + Returns: + argparse.Namespace object containing the attributes. 
+ """ + parser = argparse.ArgumentParser(description=__doc__) + parser.add_argument('-d', '--debug', action='store_true', + help='Run in debug mode (full exception traceback)') + parser.add_argument('-k', '--kernel', action='store_true', + help='Search kernel bindings when compiling') + parser.add_argument('-p', '--partial', action='store_true', + help='Validate a list of partial files (.dtsi) individually') + parser.add_argument('-r', '--raise-on-error', action='store_true', + help='Causes the validator to raise on the first ' + + 'error it finds. This is useful for debugging.') + parser.add_argument('config', type=str, nargs='+', + help='Paths to the config files (.dtb) to validated') + return parser.parse_args(argv) + + +class CrosConfigValidator(object): + """Validator for the master configuration + + Properties: + _errors: List of validation errors detected (each a string) + _fdt: fdt.Fdt object containing device tree to validate + _raise_on_error: True if the validator should raise on the first error + (useful for debugging) + _schema: Schema used for validation, a dict: + key: Compatible string + value: NodeDesc object containing schema for that compatible string + _kernel: True if we are performing validation for the kernel. This + tries to automatically add the dt-bindings search path + _schema_by_path: Schema for each node path, used when the nodes does + not have a compatible string, but still needs schema. Dict: + key: node Path + value: List of NodeDesc objects + _settings: Global settings for validation, dict: + key: setting (e.g. '#arch') + value: value for that setting (e.g. 'armv8') + _imported_elments: Partial schema imported from a file that much be + merged into the full schema. 
This allows a schema file to + """ + def __init__(self, schema, raise_on_error, kernel, settings): + self._errors = [] + self._fdt = None + self._raise_on_error = raise_on_error + self._schema = schema + self._kernel = kernel + self._schema_by_path = {} + self._settings = settings or {} + self._imported_elments = [] + + def Fail(self, location, msg): + """Record a validation failure + + Args: + location: fdt.Node object where the error occurred + msg: Message to record for this failure + """ + self._errors.append('%s: %s' % (location, msg)) + if self._raise_on_error: + raise ValueError(self._errors[-1]) + + def _CheckCondition(self, name, value, node_target, schema_target): + """Check whether a required schema condition is true + + This looks at the value of a setting to see if it matches what is + required for this schema element. + + Args: + name: Name of setting ('#setting'), or path to the target node + which needs to be checked ('../...') + value: Required value for the setting + node_target: Node that is being checked + schema_target: NodeDesc for the schema element for this node + + Returns: + True if the condition is met + False if the condition is not met + """ + if name.startswith('#'): + if name not in self._settings: + self.Fail(node_target.path, "Setting '%s' does not exist" % + name) + return False + actual = self._settings[name] + if value.startswith('!'): + if value[1:] == actual: + return False + elif value != actual: + return False + return True + + while name.startswith('../'): + schema_target = schema_target.parent + node_target = node_target.parent + name = name[3:] + actual = node_target.props.get(name) + if actual is not None: + if actual.value != value: + return False + return True + + def ElementPresent(self, schema, parent_node): + """Check whether a schema element should be present + + This handles the cond_props feature. 
The list of names of sibling + nodes/properties that are actually present is checked to see if any of them + conflict with the conditional properties for this node. If there is a + conflict, then this element is considered to be absent. + + Args: + schema: Schema element to check + parent_node: Parent fdt.Node containing this schema element (or None if + this is not known) + + Returns: + True if this element is present, False if absent + """ + if schema.cond_props and parent_node: + for rel_name, value in schema.cond_props.items(): + if not self._CheckCondition(rel_name, value, parent_node, + schema.parent): + return False + return True + + def GetElement(self, schema, name, node, expected=None): + """Get an element from the schema by name + + Args: + schema: Schema element to check + name: Name of element to find (string) + node: Node containing the property (or for nodes, the parent node + containing the subnode) we are looking up. None if none available + expected: The SchemaElement object that is expected. This can be NodeDesc + if a node is expected, PropDesc if a property is expected, or None + if either is fine.
+ + Returns: + Tuple: + Schema for the node, or None if none found + True if the node should have schema, False if it can be ignored + (because it is internal to the device-tree format) + """ + for element in schema.elements: + if not self.ElementPresent(element, node): + continue + if element.NameMatches(name): + return element, True + #elif '@' in name and element.name == name.split('@')[0]: + #return element, True + elif ((expected is None or expected == NodeDesc) and + isinstance(element, NodeAny)): + return element, True + elif ((expected is None or expected == PropDesc) and + isinstance(element, PropAny)): + return element, True + if expected == PropDesc: + if name == 'linux,phandle': + return None, False + return None, True + + def GetElementByPath(self, path): + """Find a schema element given its full path + + Args: + path: Full path to look up (e.g. '/chromeos/models/MODEL/thermal/dptf-dv') + + Returns: + SchemaElement object for that path + + Raises: + AttributeError if not found + """ + parts = path.split('/')[1:] + schema = self._schema + for part in parts: + element, _ = self.GetElement(schema, part, None) + schema = element + return schema + + def _ValidateSchema(self, node, schema): + """Simple validation of properties. + + This only handles simple mistakes like getting the name wrong. It + cannot handle relationships between different properties. + + Args: + node: fdt.Node where the property appears + schema: NodeDesc containing schema for this node + """ + schema.Validate(self, node) + schema_props = [e.name for e in schema.elements + if isinstance(e, PropDesc) and + self.ElementPresent(e, node)] + + # Validate each property and check that there are no extra properties not + # mentioned in the schema. 
+ for prop_name in node.props.keys(): + if prop_name == 'linux,phandle': # Ignore this (use 'phandle' instead) + continue + element, _ = self.GetElement(schema, prop_name, node, PropDesc) + if not element or not isinstance(element, PropDesc): + if prop_name == 'phandle': + self.Fail(node.path, 'phandle target not valid for this node') + else: + self.Fail(node.path, "Unexpected property '%s', valid list is (%s)" % + (prop_name, ', '.join(schema_props))) + continue + element.Validate(self, node.props[prop_name]) + + # Check that there are no required properties which we don't have + for element in schema.elements: + if (not isinstance(element, PropDesc) or + not self.ElementPresent(element, node)): + continue + if element.required and element.name not in node.props.keys(): + self.Fail(node.path, "Required property '%s' missing" % element.name) + + # Check that any required subnodes are present + subnode_names = [n.name for n in node.subnodes.values()] + for element in schema.elements: + if (not isinstance(element, NodeDesc) or not element.required + or not self.ElementPresent(element, node)): + continue + if element.name not in subnode_names: + msg = "Missing subnode '%s'" % element.name + if subnode_names: + msg += ' in %s' % ', '.join(subnode_names) + self.Fail(node.path, msg) + + def GetSchema(self, node, parent_schema): + """Obtain the schema for a subnode + + This finds the schema for a subnode, by scanning for a matching element. 
+
+    Args:
+      node: fdt.Node whose schema we are searching for
+      parent_schema: Schema for the parent node, which contains that schema
+
+    Returns:
+      Schema for the node, or None if none found
+    """
+    schema, needed = self.GetElement(parent_schema, node.name, node.parent,
+                                     NodeDesc)
+    if not schema and needed:
+      elements = [e.name for e in parent_schema.GetNodes()
+                  if self.ElementPresent(e, node.parent)]
+      self.Fail(os.path.dirname(node.path),
+                "Unexpected subnode '%s', valid list is (%s)" %
+                (node.name, ', '.join(elements)))
+    return schema
+
+  def _ImportSchemaFile(self, dirpath, module_name, priority):
+    """Import a schema file from the kernel
+
+    Args:
+      dirpath: Path to directory containing the schema file
+      module_name: Name of module to load (module.py)
+      priority: Numbered priority to load (0 for none, 1 for highest)
+
+    Returns:
+      True if everything went OK
+      None if the module has no schema
+      False if the module should have had schema but no schema was found
+    """
+    old_path = sys.path
+    sys.path.insert(0, dirpath)
+    try:
+      if have_importlib:
+        module = importlib.import_module(module_name)
+      else:
+        module = __import__(module_name)
+    except ImportError as e:
+      raise ValueError("Bad schema module '%s', error '%s'" %
+                       (os.path.join(dirpath, module_name), e))
+    finally:
+      sys.path = old_path
+    if getattr(module, 'no_schema', None):
+      return None
+    attr_name = 'schema%d' % priority if priority else 'schema'
+    schema = getattr(module, attr_name, None)
+    if not schema:
+      return False
+    for element in schema:
+      bad = False
+      # Most elements have a list of compatible strings for which they
+      # provide the schema.
+      if element.compat:
+        for compat in element.compat:
+          self._schema[compat] = element
+
+      # Some elements have no compatible string, but relate to a
+      # particular path in the DT.
+      elif hasattr(element, 'path'):
+        self._schema_by_path[element.path] = element
+
+      # Or perhaps we have an additional piece of schema which needs to
+      # be merged with an existing element.
+      elif priority:
+        bad = True
+        for orig in self._imported_elments:
+          if (orig.name == element.name and
+              isinstance(element, type(orig))):
+            for elem in element.elements:
+              orig.elements.append(elem)
+            bad = False
+            element = None  # Don't record this additive element
+            break
+
+      # Or maybe there has just been some mistake
+      if bad:
+        self.Fail("Module '%s', var '%s', element '%s'" %
+                  (module_name, attr_name, element.name),
+                  'Node must have compatible string or path')
+      if element:
+        self._imported_elments.append(element)
+
+    return True
+
+  def _GetSchemaFiles(self, schema_path):
+    """Find all schema files in a given path
+
+    Args:
+      schema_path: Path to schema, e.g. 'Documentation/devicetree/binding'
+
+    Returns:
+      List of schema files, each:
+        List containing:
+          Directory path
+          Base name of module (without the '.py')
+    """
+    file_list = []
+    for (dirpath, dirnames, fnames) in os.walk(schema_path):
+      for fname in fnames:
+        base, ext = os.path.splitext(fname)
+        if ext == '.py' and not base.startswith('_'):
+          file_list.append([dirpath, base])
+    return file_list
+
+  def _LoadSchema(self, schema_path):
+    """Locate and load all the schema files
+
+    This looks in the given path for .py schema files and loads them.
+
+    Args:
+      schema_path: Root path to look for Python files
+    """
+    remaining_list = self._GetSchemaFiles(schema_path)
+    priority = 0
+    while remaining_list:
+      leftover = []
+      for dirpath, base in remaining_list:
+        loaded = self._ImportSchemaFile(dirpath, base, priority)
+        if loaded is False:
+          leftover.append([dirpath, base])
+      remaining_list = leftover
+      priority += 1
+      if priority > 9:
+        self.Fail(schema_path,
+                  'Cannot locate schema in files: %s' %
+                  ', '.join(os.path.join(dirpath, base)
+                            for dirpath, base in remaining_list))
+        break
+
+  def _ValidateTree(self, node, parent_schema):
+    """Validate a node and all its subnodes recursively
+
+    Args:
+      node: fdt.Node to validate
+      parent_schema: Schema for the parent node
+    """
+    schema = None
+    base_path = node.path.split('@')[0]
+
+    # Normal case: compatible string specifies the schema
+    compats = []
+    if 'compatible' in node.props:
+      compats = fdt_util.GetCompatibleList(node)
+      for compat in compats:
+        if compat in self._schema:
+          schema = self._schema[compat]
+
+    # Schema for some nodes is specified by their path (e.g. /cpu)
+    elif base_path in self._schema_by_path:
+      schema = self._schema_by_path[base_path]
+
+    # Schema may be in a child element of this schema
+    elif isinstance(parent_schema, SchemaElement):
+      schema = self.GetSchema(node, parent_schema)
+      if isinstance(schema, NodeByPath):
+        self.Fail(node.path, 'No schema found for this path %s' % schema)
+        return
+
+    if schema is None:
+      print('No schema for: %s' % (', '.join(compats)))
+
+    if schema:
+      self._ValidateSchema(node, schema)
+    for subnode in node.subnodes.values():
+      self._ValidateTree(subnode, schema or parent_schema)
+
+  # This is not actually used - it's just an example of the more complex
+  # validation possible with this validator. Here we check for duplicate
+  # values as well as a range, plus we look to make sure the phandle target
+  # points to a suitable node.
+  @staticmethod
+  def ValidateSkuMap(val, prop):
+    it = iter(prop.value)
+    sku_set = set()
+    for sku, phandle in zip(it, it):
+      sku_id = fdt_util.fdt32_to_cpu(sku)
+      # Allow a SKU ID of -1 as a valid match.
+      if sku_id > 0xffff and sku_id != 0xffffffff:
+        val.Fail(prop.node.path, 'sku_id %d out of range' % sku_id)
+      if sku_id in sku_set:
+        val.Fail(prop.node.path, 'Duplicate sku_id %d' % sku_id)
+      sku_set.add(sku_id)
+      phandle_val = fdt_util.fdt32_to_cpu(phandle)
+      target = prop.fdt.LookupPhandle(phandle_val)
+      if (not CheckPhandleTarget(val, target, '/chromeos/models/MODEL') and
+          not CheckPhandleTarget(val, target,
+                                 '/chromeos/models/MODEL/submodels/SUBMODEL')):
+        val.Fail(prop.node.path,
+                 "Phandle '%s' sku-id %d must target a model or submodel" %
+                 (prop.name, sku_id))
+
+  def Prepare(self, _fdt):
+    """Get ready to validate a DT file"""
+    self._fdt = _fdt
+
+  def Start(self, fnames, partial=False):
+    """Start validating a DT file
+
+    Args:
+      fnames: List of filenames containing the configuration to validate.
+          Supports compiled .dtb files and source .dts files. If
+          partial is False then there can be only one filename in the
+          list.
+      partial: True to process a list of partial config files (.dtsi)
+    """
+    tmpfile = None
+    self._errors = []
+    try:
+      if partial:
+        dtb, tmpfile = fdt_util.CompileAll(fnames)
+      else:
+        search_paths = [os.path.join(os.getcwd(), 'include')]
+        if self._kernel:
+          # Add kernel bindings dir if found
+          pathname = os.path.dirname(fnames[0])
+          dirs = []
+          for _ in range(4):
+            pathname, dirname = os.path.split(pathname)
+            dirs.insert(0, dirname)
+          if dirs[-2:] == ['boot', 'dts']:
+            search_paths.append(os.path.join(pathname, 'include'))
+
+        dtb, tmpfile = fdt_util.EnsureCompiled(fnames[0], search_paths)
+        schema_path = os.path.join(pathname, 'Documentation',
+                                   'devicetree', 'bindings')
+        self._LoadSchema(schema_path)
+      self.Prepare(fdt.FdtScan(dtb))
+
+      self._ValidateTree(self._fdt.GetRoot(), self._schema)
+    finally:
+      if tmpfile:
+        os.unlink(tmpfile.name)
+    return self._errors
+
+
+"""This is the schema. It is a hierarchical set of nodes and properties, just
+like the device tree. If an object subclasses NodeDesc then it is a node,
+possibly with properties and subnodes.
+
+In this way it is possible to describe the schema in a fairly natural,
+hierarchical way.
+
+Note: The schema starts off empty and is read from .py files in the kernel.
+This seems like a better approach than trying to have the schema all in one
+file.
+"""
+SCHEMA = {}
+
+
+def ShowErrors(fname, errors):
+  """Show validation errors
+
+  Args:
+    fname: Filename containing the errors
+    errors: List of errors, each a string
+  """
+  print('%s:' % fname, file=sys.stderr)
+  for error in errors:
+    print(error, file=sys.stderr)
+  print(file=sys.stderr)
+
+
+def Main(argv=None):
+  """Main program for validator
+
+  This validates each of the provided files and prints the errors for each, if
+  any.
+
+  Args:
+    argv: Arguments to the program (excluding argv[0]); if None, uses
+        sys.argv
+  """
+  if argv is None:
+    argv = sys.argv[1:]
+  args = ParseArgv(argv)
+  settings = {'#arch': 'armv7'}
+  validator = CrosConfigValidator(SCHEMA, args.raise_on_error, args.kernel,
+                                  settings)
+  found_errors = False
+  try:
+    # If we are given partial files (.dtsi) then we compile them all into one
+    # .dtb and validate that.
+    if args.partial:
+      errors = validator.Start(args.config, partial=True)
+      fname = args.config[0]
+      if errors:
+        ShowErrors(fname, errors)
+        found_errors = True
+
+    # Otherwise process each file individually
+    else:
+      for fname in args.config:
+        errors = validator.Start([fname])
+        if errors:
+          ShowErrors(fname, errors)
+          found_errors = True
+  except ValueError as e:
+    if args.debug:
+      raise
+    print('Failed: %s' % e, file=sys.stderr)
+    found_errors = True
+  if found_errors:
+    sys.exit(1)
+
+
+if __name__ == "__main__":
+  Main()
-- 
2.21.0.593.g511ec345e18-goog
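
For reference, here is a rough sketch of what a loadable schema module (the kind _ImportSchemaFile() consumes) might look like. The NodeDesc/PropDesc classes below are minimal illustrative stand-ins, not the real kschema.py API, and the compatible string 'vendor,example-device' and property names are invented for the example; a real module would import these classes from kschema.

```python
# Minimal stand-ins for the kschema classes used by validate_dts.py.
# These are illustrative assumptions, not the real kschema.py definitions.
class PropDesc:
    """Describes a single property and whether it must be present."""
    def __init__(self, name, required=False):
        self.name = name
        self.required = required

class NodeDesc:
    """Describes a node, keyed by its list of compatible strings."""
    def __init__(self, name, compat=None, elements=None):
        self.name = name
        self.compat = compat or []
        self.elements = elements or []

# A schema module exports a module-level 'schema' list; the validator's
# _ImportSchemaFile() indexes each element by its compatible strings.
schema = [
    NodeDesc('example-device', compat=['vendor,example-device'],
             elements=[
                 PropDesc('reg', required=True),
                 PropDesc('interrupts'),
             ]),
]

# Index by compatible string, as the validator does when loading a module
by_compat = {c: e for e in schema for c in e.compat}
print(sorted(by_compat))  # prints ['vendor,example-device']
```

With a dictionary like this, validating a node is a lookup of its 'compatible' value followed by a walk over the element list, which is what makes the approach fast even for large schemas.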