[SPARK-37668][PYTHON] 'Index' object has no attribute 'levels' in pyspark.pandas.frame.DataFrame.insert

### What changes were proposed in this pull request?

This PR proposes to address the unexpected error in `pyspark.pandas.frame.DataFrame.insert`.

Assigning a tuple as a column name is currently only supported for MultiIndex columns in pandas API on Spark:

```python
# MultiIndex column
>>> psdf
   x
   y
0  1
1  2
2  3
>>> psdf[('a', 'b')] = [4, 5, 6]
>>> psdf
   x  a
   y  b
0  1  4
1  2  5
2  3  6

# However, not supported for non-MultiIndex column
>>> psdf
   A
0  1
1  2
2  3
>>> psdf[('a', 'b')] = [4, 5, 6]
Traceback (most recent call last):
...
KeyError: 'Key length (2) exceeds index depth (1)'
```

So, we should show a proper error message rather than `AttributeError: 'Index' object has no attribute 'levels'` when users try to insert a tuple-named column.
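The underlying cause is that `.levels` is an attribute of `pandas.MultiIndex` only, not of a plain `Index`, which a quick check in plain pandas illustrates:

```python
import pandas as pd

# MultiIndex exposes .levels ...
multi = pd.MultiIndex.from_tuples([("a", "b"), ("a", "c")])
print(hasattr(multi, "levels"))  # True

# ... but a flat Index does not, hence the AttributeError above.
flat = pd.Index(["A"])
print(hasattr(flat, "levels"))  # False
```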

**Before**
```python
>>> psdf.insert(0, ("a", "b"), 10)
Traceback (most recent call last):
...
AttributeError: 'Index' object has no attribute 'levels'
```

**After**
```python
>>> psdf.insert(0, ("a", "b"), 10)
Traceback (most recent call last):
...
NotImplementedError: Assigning column name as tuple is only supported for MultiIndex columns for now.
```
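A minimal, pyspark-free sketch of the guard this patch adds; `column_labels_level` here is a stand-in for the internal level count the real method reads from `self._internal`:

```python
def check_insert_column_name(column, column_labels_level):
    """Sketch of the tuple-name validation added to DataFrame.insert.

    column_labels_level is 1 for a plain column Index and >1 for a
    MultiIndex; only the latter accepts tuple column names for now.
    """
    if isinstance(column, tuple):
        if column_labels_level > 1:
            if len(column) != column_labels_level:
                # To be consistent with pandas
                raise ValueError(
                    '"column" must have length equal to number of column levels.'
                )
        else:
            raise NotImplementedError(
                "Assigning column name as tuple is only supported "
                "for MultiIndex columns for now."
            )


check_insert_column_name(("a", "b"), 2)  # OK: tuple length matches levels
try:
    check_insert_column_name(("a", "b"), 1)  # flat index: not supported
except NotImplementedError as e:
    print(type(e).__name__)  # NotImplementedError
```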

### Why are the changes needed?

To let users know the proper usage.

### Does this PR introduce _any_ user-facing change?

Yes, the exception message is changed as described in the **After**.

### How was this patch tested?

Unit tests.

Closes apache#34957 from itholic/SPARK-37668.

Authored-by: itholic <[email protected]>
Signed-off-by: Hyukjin Kwon <[email protected]>
itholic authored and HyukjinKwon committed Dec 23, 2021
1 parent 198b90c commit f6be769
Showing 2 changed files with 21 additions and 4 deletions.
python/pyspark/pandas/frame.py (10 additions, 4 deletions)

```diff
@@ -3988,13 +3988,19 @@ def insert(
                 '"column" should be a scalar value or tuple that contains scalar values'
             )
 
+        # TODO(SPARK-37723): Support tuple for non-MultiIndex column name.
         if is_name_like_tuple(column):
-            if len(column) != len(self.columns.levels):  # type: ignore[attr-defined]  # SPARK-37668
-                # To be consistent with pandas
-                raise ValueError('"column" must have length equal to number of column levels.')
+            if self._internal.column_labels_level > 1:
+                if len(column) != len(self.columns.levels):  # type: ignore[attr-defined]
+                    # To be consistent with pandas
+                    raise ValueError('"column" must have length equal to number of column levels.')
+            else:
+                raise NotImplementedError(
+                    "Assigning column name as tuple is only supported for MultiIndex columns for now."
+                )
 
         if column in self.columns:
-            raise ValueError("cannot insert %s, already exists" % column)
+            raise ValueError("cannot insert %s, already exists" % str(column))
 
         psdf = self.copy()
         psdf[column] = value
```
python/pyspark/pandas/tests/test_dataframe.py (11 additions, 0 deletions)

```diff
@@ -223,6 +223,12 @@ def test_insert(self):
             "loc must be int",
             lambda: psdf.insert((1,), "b", 10),
         )
+        self.assertRaisesRegex(
+            NotImplementedError,
+            "Assigning column name as tuple is only supported for MultiIndex columns for now.",
+            lambda: psdf.insert(0, ("e",), 10),
+        )
+
         self.assertRaises(ValueError, lambda: psdf.insert(0, "e", [7, 8, 9, 10]))
         self.assertRaises(ValueError, lambda: psdf.insert(0, "f", ps.Series([7, 8])))
         self.assertRaises(AssertionError, lambda: psdf.insert(100, "y", psser))
@@ -247,6 +253,11 @@ def test_insert(self):
         self.assertRaisesRegex(
             ValueError, "cannot insert d, already exists", lambda: psdf.insert(4, "d", 11)
         )
+        self.assertRaisesRegex(
+            ValueError,
+            r"cannot insert \('x', 'a', 'b'\), already exists",
+            lambda: psdf.insert(4, ("x", "a", "b"), 11),
+        )
         self.assertRaisesRegex(
             ValueError,
             '"column" must have length equal to number of column levels.',
```
