Exploring TensorFlow’s Multi-Level APIs: Low-Level Ops to High-Level Keras
This notebook investigates how TensorFlow’s different API layers — from low-level tensor operations to high-level Keras abstractions — interact and can be composed. The aim is to:
- Understand the flexibility of TensorFlow’s multi-level design.
- Demonstrate equivalent operations at different abstraction levels.
- Compare workflows for creating and training models using these APIs.
The notebook follows an incremental abstraction approach:
- Low-level TensorFlow ops – Define computations directly with `tf.constant`, `tf.Variable`, and math operations.
- Mid-level Layers API – Build models using `tf.keras.layers` with manual weight initialization and forward passes.
- High-level Keras API – Use `tf.keras.Sequential` and `tf.keras.Model` for rapid prototyping, training, and evaluation.
- Mix-and-match strategy – Combine low-level control with high-level convenience for custom model building.
- Dataset pipelines – Explore `tf.data.Dataset` for feeding data to models.
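As a minimal sketch of the first two levels (the variable names and shapes here are illustrative, not taken from the notebook), the same affine transform can be written with raw ops and with a Keras layer:

```python
import tensorflow as tf

# Low-level: define tensors and a manual forward pass y = xW + b
x = tf.constant([[1.0, 2.0]])                        # input, shape (1, 2)
W = tf.Variable(tf.random.normal((2, 3)), name="W")  # trainable weights
b = tf.Variable(tf.zeros((3,)), name="b")            # trainable bias
y_manual = tf.matmul(x, W) + b                       # explicit math ops

# Mid-level: the same computation through a Keras layer
dense = tf.keras.layers.Dense(3)
dense.build(x.shape)   # manual weight initialization
y_layer = dense(x)     # forward pass via __call__

print(y_manual.shape, y_layer.shape)  # both (1, 3)
```

The two results differ only in how the weights are created and stored; the underlying computation is identical.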
From the code:
- TensorFlow – core ops, variables, optimizers, datasets.
- Keras (via TensorFlow) – `Sequential`, `Model`, `Dense`, `Flatten`, and other layers.
- NumPy – array creation and manipulation.
Not provided – The notebook generates synthetic data (NumPy arrays / tensors) for demonstrations. No external dataset is loaded.
Requirements:
```bash
pip install tensorflow numpy
```

Run the notebook:

```bash
jupyter notebook multi_level_apis_me.ipynb
```

or in JupyterLab:

```bash
jupyter lab multi_level_apis_me.ipynb
```

Execute cells sequentially to reproduce the examples and outputs.
- Low-level: Manual tensor math produced correct forward passes and weight updates.
- Mid-level: `tf.keras.layers.Dense` with manual calls yielded the same functional results with less boilerplate.
- High-level: `Sequential` and `Model` APIs allowed rapid building, compilation, and fitting.
- Creating a `tf.data.Dataset` from tensors demonstrated efficient batch processing.
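A compact sketch of the high-level workflow with a batched input pipeline (the synthetic data, layer sizes, and epoch count here are illustrative assumptions, not the notebook's exact values):

```python
import numpy as np
import tensorflow as tf

# Synthetic data, mirroring the notebook's use of NumPy arrays
X = np.random.rand(256, 4).astype("float32")
y = (X.sum(axis=1) > 2.0).astype("float32")

# Batched input pipeline built from in-memory tensors
ds = tf.data.Dataset.from_tensor_slices((X, y)).shuffle(256).batch(32)

# High-level: build, compile, and fit a small Sequential model
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
history = model.fit(ds, epochs=3, verbose=0)
print(history.history["loss"])  # one loss value per epoch
```

Because `fit` accepts the `Dataset` directly, batching and shuffling stay in the input pipeline rather than in the training code.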
Representative console outputs from the notebook:
```
Eager execution results: [ ... ]
Layer API output shape: (batch_size, units)
Sequential model training: loss decreased over epochs
```
(Exact values depend on random seeds and synthetic data.)
(Extracted from notebook outputs)
Low-Level Tensor Math:
```
<tf.Tensor: shape=(...), dtype=float32, numpy=...>
```
Keras Sequential Summary:
```
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #
=================================================================
 dense (Dense)               (None, X)                 Y
 ...
=================================================================
```
- TensorFlow offers progressive abstraction layers — allowing developers to trade off between control and convenience.
- The Layer API is a practical balance: low enough for customization, high enough to avoid manual graph wiring.
- The Sequential / Model API is ideal for rapid iteration but can be combined with low-level ops for advanced needs.
- `tf.data.Dataset` integrates seamlessly with all API levels, providing efficient input pipelines.
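The mix-and-match strategy can be sketched as follows: a Keras layer supplies the parameters while a hand-written loop controls the optimization step (the regression target, learning rate, and step count here are illustrative assumptions):

```python
import tensorflow as tf

# Mix-and-match: a Keras layer trained by a hand-written loop
layer = tf.keras.layers.Dense(1)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

x = tf.constant([[0.0], [1.0], [2.0], [3.0]])
y = 2.0 * x + 1.0  # target: a simple line

for _ in range(200):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(layer(x) - y))  # MSE by hand
    grads = tape.gradient(loss, layer.trainable_variables)
    opt.apply_gradients(zip(grads, layer.trainable_variables))

print(float(loss))  # should be close to zero after fitting the line
```

This keeps the low-level control over the update rule while delegating weight creation and tracking to the layer, which is the trade-off the notebook explores.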
💡 Some interactive outputs (e.g., plots, widgets) may not display correctly on GitHub. If so, please view this notebook via nbviewer.org for full rendering.
Mehran Asgari
Email: imehranasgari@gmail.com
GitHub: https://github.com/imehranasgari
This project is licensed under the Apache 2.0 License – see the LICENSE file for details.