# Dataset Card: Viscous Cahn-Hilliard Optimal Control

## Dataset Summary

This dataset contains 2,000 high-fidelity simulations of the Viscous Cahn-Hilliard (vCH) equation under randomized control forcing. It was generated to support research in sparse optimal control, scientific machine learning (SciML), and phase-field modeling.

Official Code Repository: Sparse-optimal-control-of-Viscous-Chan-hilliard (GitHub)

Each sample represents the evolution of a two-phase system (phase separation of immiscible components) steered by an external control field. The data was generated with a custom high-precision finite-difference solver using Crank-Nicolson time integration.
## Dataset Structure

### Tensor Shape

The data is logically a 5D tensor, physically sharded into Parquet files:

- Shape: `(2000, 101, 128, 128, 4)`
- Dimensions: `(Samples, Time Steps, Height, Width, Channels)`
### Channel Definitions

Based on the `data_process.py` pipeline, the 4 channels correspond to:

- **Channel 0 (`phi` / $\varphi$):** The phase-field order parameter ($\varphi \in [-1, 1]$), representing the concentration difference between the two phases.
- **Channel 1 (`mu` / $\mu$):** The chemical potential, defined as $\mu = -\kappa\Delta\varphi + f'(\varphi) - w$.
- **Channel 2 (`w`):** The auxiliary control variable, governed by the relaxation equation $\gamma \partial_t w + w = u$.
- **Channel 3 (`u`):** The external control forcing field applied to the system.
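For convenience, the channel ordering above can be captured as a small lookup table. The index ordering is taken from the list above; treat it as an assumption to verify against `data_process.py`:

```python
import numpy as np

# Channel indices as listed above (assumed ordering from data_process.py).
CHANNELS = {"phi": 0, "mu": 1, "w": 2, "u": 3}

def get_channel(frame, name):
    """Slice one physical field out of a (128, 128, 4) frame tensor."""
    return frame[:, :, CHANNELS[name]]
```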
### Physical Domain

- Spatial Domain: $\Omega = [0, 1] \times [0, 1]$
- Grid Resolution: $128 \times 128$ (uniform grid spacing $h \approx 0.0078$)
- Time Horizon: $T = 1.0$ seconds
- Time Resolution: $dt = 0.01$ (101 frames per simulation)
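A minimal sketch of the space-time grid implied by these values. A cell-centered layout is assumed here because it makes $h = 1/128 \approx 0.0078$ exact; a node-centered grid with $h = 1/127$ would match the stated value equally well:

```python
import numpy as np

N = 128
h = 1.0 / N                      # grid spacing, h = 0.0078125
x = (np.arange(N) + 0.5) * h     # cell centers covering [0, 1] (assumed layout)
X, Y = np.meshgrid(x, x)         # 128 x 128 spatial grid
t = np.linspace(0.0, 1.0, 101)   # 101 frames over T = 1.0, so dt = 0.01
```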
## Data Generation Details

### 1. The Physics Model

The data follows the viscous Cahn-Hilliard system with a logarithmic potential, ensuring the phase field stays strictly within the physical bounds $(-1, 1)$.
### 2. Numerical Implementation

The solver (`Forward2_solver.py`) utilizes:

- **Spatial Discretization:** standard 5-point finite-difference stencil with Neumann boundary conditions (ghost points).
- **Time Integration:** semi-implicit Crank-Nicolson scheme.
  - Implicit: linear diffusion and the convex part of the potential ($c_1$).
  - Explicit: the concave part of the potential ($c_2$).
- **Nonlinear Solver:** monolithic Newton-Raphson with Armijo backtracking line search.
- **Mass Conservation:** enforced via a global Lagrange-multiplier correction step.
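As a concrete illustration, the 5-point Neumann Laplacian and the chemical potential from the channel definitions might look like the sketch below. The exact form of $f'(\varphi)$ is an assumption (a Flory-Huggins-style log term scaled by $c_1$ minus a quadratic term scaled by $c_2$); the repository's solver is authoritative:

```python
import numpy as np

def laplacian_neumann(f, h):
    """5-point finite-difference Laplacian with homogeneous Neumann BCs,
    implemented by mirroring boundary values into a ghost-point layer."""
    g = np.pad(f, 1, mode="edge")
    return (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:] - 4.0 * f) / h**2

def f_prime(phi, c1=0.75, c2=1.0):
    """Assumed derivative of the logarithmic potential: the log term diverges
    as phi -> +/-1, which is what keeps the phase field inside (-1, 1)."""
    return c1 * np.log((1.0 + phi) / (1.0 - phi)) - c2 * phi

def chemical_potential(phi, w, h=1.0 / 128, kappa=9.0e-4):
    """mu = -kappa * Laplacian(phi) + f'(phi) - w, as stated in the channel list."""
    return -kappa * laplacian_neumann(phi, h) + f_prime(phi) - w
```

Note that the edge-mirroring pad enforces a zero normal derivative at the boundary, which is the discrete analogue of homogeneous Neumann conditions.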
### 3. Simulation Parameters

The dataset was generated using the configuration from `config.py`:

| Parameter | Symbol | Value | Description |
|---|---|---|---|
| Viscosity | $\tau$ | 0.05 | Relaxation time for the phase equation. |
| Control Relaxation | $\gamma$ | 10.0 | Damping factor for the control variable $w$. |
| Interface Width | $\kappa$ | 9.0e-4 | Gradient energy coefficient (controls interface thickness). |
| Potential (Convex) | $c_1$ | 0.75 | Flory-Huggins logarithmic coefficient. |
| Potential (Concave) | $c_2$ | 1.0 | Quadratic destabilizing coefficient. |
| Time Step | $dt$ | 0.01 | Sampling rate (solver internal steps may be smaller). |
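For downstream scripts, the table above can be mirrored as a plain dictionary. The key names are illustrative; the actual attribute names in `config.py` may differ:

```python
# Illustrative mirror of the simulation parameters; key names are assumptions.
VCH_CONFIG = {
    "tau": 0.05,      # viscosity / phase-equation relaxation time
    "gamma": 10.0,    # damping factor for the control variable w
    "kappa": 9.0e-4,  # gradient energy coefficient (interface width)
    "c1": 0.75,       # convex (logarithmic) potential coefficient
    "c2": 1.0,        # concave (quadratic) potential coefficient
    "dt": 0.01,       # sampling time step
}
```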
### 4. Input Control Generation

The control fields `u` were generated procedurally as smooth random fields:

- Random noise is generated in Fourier space.
- A low-pass filter ($K^{-1.5}$) is applied to ensure smoothness.
- Keyframes are interpolated over time using cubic splines to create continuous, time-varying forcing.
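A minimal sketch of the spatial part of this procedure. The $K^{-1.5}$ exponent comes from the list above; the noise normalization is an assumption:

```python
import numpy as np

def smooth_random_field(n=128, exponent=1.5, seed=0):
    """One smooth random keyframe: white noise in Fourier space,
    damped by |K|^(-exponent), then transformed back to real space."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n) * n
    ky = np.fft.fftfreq(n) * n
    K = np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2)
    K[0, 0] = 1.0  # avoid division by zero at the mean (zero-frequency) mode
    field = np.fft.ifft2(np.fft.fft2(noise) * K ** (-exponent)).real
    return field / np.abs(field).max()  # normalize to [-1, 1] (assumed)
```

Several such keyframes could then be interpolated along the time axis with, e.g., `scipy.interpolate.CubicSpline` to produce the continuous forcing described above.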
How to Load (Python)
import pandas as pd
import numpy as np
from datasets import load_dataset
# 1. Load the dataset (streaming recommended for 50GB)
dataset = load_dataset("your-username/your-repo-name", split="train", streaming=True)
# 2. Iterate through samples
for row in dataset:
# 3. Reshape the flattened arrays back to (128, 128, 4)
# Assuming your Parquet conversion flattened the spatial dims
sample_id = row['sample_id']
timestep = row['time_step']
# Shape depends on your specific parquet conversion script
# Example for flattened 1D array column:
data_flat = np.array(row['readings'])
frame_tensor = data_flat.reshape(128, 128, 4)
phi = frame_tensor[:, :, 0] # Phase field
u = frame_tensor[:, :, 3] # Control input
print(f"Sample {sample_id} at t={timestep}: Phi range [{phi.min():.2f}, {phi.max():.2f}]")
break
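If you need full trajectories rather than individual frames, the per-frame rows can be regrouped into one `(time, height, width, channels)` tensor per sample. This helper is a sketch under the same column-layout assumption as the loading snippet above:

```python
import numpy as np
from collections import defaultdict

def stack_samples(rows, h=128, w=128, c=4):
    """Group flat per-frame rows into a (n_steps, h, w, c) tensor per sample_id.

    `rows` is any iterable of dicts with keys 'sample_id', 'time_step', and
    'readings' (the flattened frame), matching the layout assumed above.
    """
    by_sample = defaultdict(dict)
    for row in rows:
        frame = np.asarray(row["readings"], dtype=np.float32).reshape(h, w, c)
        by_sample[row["sample_id"]][row["time_step"]] = frame
    return {
        sid: np.stack([frames[t] for t in sorted(frames)])
        for sid, frames in by_sample.items()
    }
```

Sorting by `time_step` before stacking keeps the trajectory ordered even when rows arrive out of order from a streamed shard.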