Vengineer's Delusions (preparation period)

Life is short, but it's also long. Let's enjoy life!

TensorFlow XLA's XRT client


In TensorFlow XLA's XRT, a commit titled [XRT Add an prototype XRT client that communicates via the TF remote…] has landed:

[XRT] Add an prototype XRT client that communicates via the TF remote eager RPC API.

This client provides a new C++-based path to talk to XRT.

So the commit message says.

And then, after that,

[XLA:Python Add a prototype XRT client that runs XRT ops using the TensorFlow remote eager protocol.]

a Python interface was added as well.

It looks like you can write it in Python as shown below.
"""Tests for the XRT client."""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

import numpy as np

from tensorflow.compiler.xla.python import xla_client
from tensorflow.compiler.xla.python import xrt
from tensorflow.python.platform import test


def BuildAddAndScaleComputation(shape1, shape2):
  """Builds the computation (a + b) * 3."""
  b = xla_client.ComputationBuilder("add-and-scale")
  x = b.ParameterWithShape(shape1)
  y = b.ParameterWithShape(shape2)
  dtype = shape1.numpy_dtype().type
  b.Mul(b.Add(x, y), b.Constant(dtype(3)))
  return b.Build()


# TODO(phawkins): add more tests, beyond a simple "hello world" example.
class XrtBackendTest(test.TestCase):

  def testBasics(self):
    (worker,), _ = test.create_local_cluster(num_workers=1, num_ps=0)
    self.assertTrue(worker.target.startswith("grpc://"))
    tf_context = xrt.get_tf_context(worker.target[len("grpc://"):], "worker")
    backend = xrt.XrtBackend(tf_context, "XLA_CPU")

    a = np.arange(10)
    b = np.arange(10)

    c = BuildAddAndScaleComputation(
        xla_client.Shape.from_pyval(a), xla_client.Shape.from_pyval(b))

    executable = c.Compile(backend=backend)
    output = executable.ExecuteWithPythonValues((a, b))
    self.assertAllEqual(output, (a + b) * 3)
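For reference, the computation this test compiles and runs through XRT is just the elementwise (a + b) * 3. A plain NumPy sketch (no XRT or cluster needed) of the value the final assertion compares against:

```python
import numpy as np

# Same inputs as the test above.
a = np.arange(10)
b = np.arange(10)

# The computation built by BuildAddAndScaleComputation: (a + b) * 3.
expected = (a + b) * 3
print(expected)  # [ 0  6 12 18 24 30 36 42 48 54]
```

The XRT path produces the same values; the test merely checks that the remotely compiled executable matches this NumPy result.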