Drop Python 3.8, 3.9 and add Python 3.12 #307

Merged: 7 commits, Jul 18, 2024
12 changes: 3 additions & 9 deletions .github/workflows/build_upload_pypi_wheels.yml
@@ -10,7 +10,7 @@ jobs:
strategy:
matrix:
os: [windows-latest, macos-13, ubuntu-latest]
python-version: ['3.8', '3.9', '3.10', '3.11']
python-version: ['3.10', '3.11']
include:
- os: windows-latest
wheelname: win
@@ -19,16 +19,10 @@
- os: ubuntu-latest
wheelname: manylinux
# Build wheels against the lowest compatible Numpy version
- python-version: 3.8
manylinux-version-tag: cp38
numpy-version: 1.19.5
- python-version: 3.9
manylinux-version-tag: cp39
numpy-version: 1.19.5
- python-version: 3.10
- python-version: '3.10'
manylinux-version-tag: cp310
numpy-version: 1.21.3
- python-version: 3.11
- python-version: '3.11'
manylinux-version-tag: cp311
numpy-version: 1.23.2
fail-fast: false
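A note on the quoting change in the matrix above: YAML parses an unquoted 3.10 as the float 3.1, which is presumably why the surviving entries are written as '3.10' and '3.11'. A quick illustrative check with PyYAML:

```python
import yaml  # PyYAML

# Unquoted version numbers are parsed as floats, losing the trailing zero
print(yaml.safe_load("python-version: 3.10"))    # {'python-version': 3.1}
# Quoted versions survive as strings
print(yaml.safe_load("python-version: '3.10'"))  # {'python-version': '3.10'}
```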
2 changes: 1 addition & 1 deletion .github/workflows/create-landing-page.yml
@@ -17,7 +17,7 @@ jobs:
ref: gh-pages
- uses: actions/setup-python@v4
with:
python-version: 3.8
python-version: '3.10'
- name: Update pip and install dependencies
run: |
python -m pip install --upgrade pip
8 changes: 4 additions & 4 deletions .github/workflows/run_tests.yml
@@ -24,7 +24,7 @@ jobs:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: 3.8
python-version: '3.10'
channels: conda-forge,defaults
channel-priority: true
- name: Install llvm on Macos
@@ -36,10 +36,10 @@
run: |
python -m pip install --upgrade pip
python -m pip install -r tests_and_analysis/ci_requirements.txt
- name: Run tests, skip Python 3.9, 3.10 unless workflow dispatch
- name: Run tests, skip Python 3.11 unless workflow dispatch
if: github.event_name != 'workflow_dispatch'
env:
TOX_SKIP_ENV: '.*?(py39|py310).*?'
TOX_SKIP_ENV: '.*?(py311).*?'
shell: bash -l {0}
run: python -m tox
- name: Run tests, workflow dispatch so test all Python versions
@@ -70,7 +70,7 @@ jobs:
- uses: actions/checkout@v4
- uses: conda-incubator/setup-miniconda@v3
with:
python-version: 3.8
python-version: '3.10'
channels: conda-forge,defaults
channel-priority: true
- name: Update pip and install dependencies
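For context on the TOX_SKIP_ENV change: tox reads this variable as a regular expression and skips any environment whose name it matches, so on ordinary CI runs only the py311 environments are now skipped. A small sketch of which names the new pattern matches, assuming it is applied to the full environment name (the leading and trailing .*? suggest this); the environment names below are illustrative:

```python
import re

skip_pattern = re.compile(r'.*?(py311).*?')  # value of TOX_SKIP_ENV in the workflow

for env in ('pypi-py310', 'pypi-py311', 'conda-py312'):
    action = 'skipped' if skip_pattern.fullmatch(env) else 'run'
    print(f'{env}: {action}')
# pypi-py310: run
# pypi-py311: skipped
# conda-py312: run
```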
4 changes: 2 additions & 2 deletions .github/workflows/test_release.yml
@@ -10,14 +10,14 @@ jobs:
test:
strategy:
matrix:
os: [ubuntu-latest, windows-latest, macos-latest]
os: [ubuntu-latest, windows-latest, macos-latest, macos-13]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v3
- uses: conda-incubator/setup-miniconda@v2
with:
python-version: 3.8
python-version: '3.10'
channels: conda-forge,defaults
channel-priority: true
- name: Install llvm on Macos
2 changes: 1 addition & 1 deletion .readthedocs.yml
@@ -13,7 +13,7 @@ sphinx:
build:
os: ubuntu-22.04
tools:
python: "3.8"
python: "3.10"

# Optionally set the version of Python and requirements required to build your docs
python:
2 changes: 1 addition & 1 deletion doc/requirements.txt
@@ -1,4 +1,4 @@
numpy>=1.14.5
numpy>=1.21.3
sphinx==5.3.0
sphinx-argparse==0.3.2
sphinx-autodoc-typehints==1.19.5
2 changes: 1 addition & 1 deletion doc/source/cite.rst
@@ -12,7 +12,7 @@ or it can be read programatically as follows:

import yaml
import euphonic
from importlib_resources import files
from importlib.resources import files

with open(files(euphonic) / 'CITATION.cff') as fp:
citation_data = yaml.safe_load(fp)
4 changes: 2 additions & 2 deletions doc/source/installation.rst
@@ -5,7 +5,7 @@ Installation

.. contents:: :local:

Euphonic has been tested on Python 3.8 - 3.10.
Euphonic has been tested on Python 3.10 - 3.12.

Pip
===
@@ -47,7 +47,7 @@ To create a "complete" installation in a new environment:

.. code-block:: bash

conda create -n euphonic-forge -c conda-forge python=3.8 euphonic matplotlib-base pyyaml tqdm h5py
conda create -n euphonic-forge -c conda-forge python=3.10 euphonic matplotlib-base pyyaml tqdm h5py

This creates an environment named "euphonic-forge", which can be
entered with ``activate euphonic-forge`` and exited with
3 changes: 2 additions & 1 deletion euphonic/__init__.py
@@ -1,9 +1,10 @@
from importlib.resources import files

from . import _version
__version__ = _version.get_versions()['version']

import pint
from pint import UnitRegistry
from importlib_resources import files

# Create ureg here so it is only created once
ureg = UnitRegistry()
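The importlib_resources backport was only required to support Python 3.8; the equivalent files() API has been in the standard library's importlib.resources since Python 3.9, so dropping 3.8 lets the stdlib import replace the third-party one throughout. A minimal sketch of the stdlib usage, mirroring the CITATION.cff access shown in the cite.rst change above:

```python
from importlib.resources import files  # stdlib since Python 3.9

import yaml
import euphonic

# Locate and read a data file shipped inside the installed euphonic package
with open(files(euphonic) / 'CITATION.cff') as fp:
    citation_data = yaml.safe_load(fp)
```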
2 changes: 1 addition & 1 deletion euphonic/readers/castep.py
@@ -111,7 +111,7 @@ def read_phonon_dos_data(
_, idx = np.unique(atom_type, return_index=True)
unique_types = atom_type[np.sort(idx)]
for i, species in enumerate(unique_types):
dos_dict[species] = dos_data[:, i + 2]/dos_conv
dos_dict[str(species)] = dos_data[:, i + 2]/dos_conv

return data_dict

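The str() cast here is small but significant: indexing a NumPy array of strings returns numpy.str_ rather than plain str, and the cast presumably keeps the DOS dictionary keyed by ordinary Python strings. An illustrative sketch (the species names are made up):

```python
import numpy as np

unique_types = np.array(['O', 'Si', 'Zr'])
print(type(unique_types[0]))       # <class 'numpy.str_'>
print(type(str(unique_types[0])))  # <class 'str'>

# With the cast, the dictionary keys are plain strings
dos_dict = {str(species): None for species in unique_types}
print(list(dos_dict))              # ['O', 'Si', 'Zr']
```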
2 changes: 1 addition & 1 deletion euphonic/styles/__init__.py
@@ -1,5 +1,5 @@
"""Matplotlib stylesheets for plot styling"""
from importlib_resources import files
from importlib.resources import files

base_style = files(__package__) / "base.mplstyle"
intensity_widget_style = files(__package__) / "intensity_widget.mplstyle"
56 changes: 27 additions & 29 deletions euphonic/util.py
@@ -1,17 +1,16 @@
from collections import OrderedDict
from functools import reduce
from importlib.resources import files
import itertools
import json
import math
import os.path
import sys
from typing import Dict, Sequence, Union, Tuple, Optional, List
from typing import Sequence, Optional
import warnings

import numpy as np
import seekpath
from seekpath.hpkot import SymmetryDetectionError
from importlib_resources import files
from pint import UndefinedUnitError

from euphonic import ureg, Quantity
@@ -52,7 +51,7 @@ def direction_changed(qpts: np.ndarray, tolerance: float = 5e-6
return np.abs(np.abs(dot) - modq[1:]*modq[:-1]) > tolerance


def is_gamma(qpt: np.ndarray) -> Union[bool, np.ndarray]:
def is_gamma(qpt: np.ndarray) -> bool | np.ndarray:
"""
Determines whether the given point(s) are gamma points

@@ -73,7 +72,7 @@ def is_gamma(qpt: np.ndarray) -> Union[bool, np.ndarray]:
return isgamma


def mp_grid(grid: Tuple[int, int, int]) -> np.ndarray:
def mp_grid(grid: tuple[int, int, int]) -> np.ndarray:
"""
Returns the q-points on a MxNxL Monkhorst-Pack grid specified by
grid
@@ -101,8 +100,8 @@ def mp_grid(grid: Tuple[int, int, int]) -> np.ndarray:
return np.column_stack((qh, qk, ql))


def get_all_origins(max_xyz: Tuple[int, int, int],
min_xyz: Tuple[int, int, int] = (0, 0, 0),
def get_all_origins(max_xyz: tuple[int, int, int],
min_xyz: tuple[int, int, int] = (0, 0, 0),
step: int = 1) -> np.ndarray:
"""
Given the max/min number of cells in each direction, get a list of
@@ -133,10 +132,10 @@ def get_all_origins(max_xyz: Tuple[int, int, int],


def get_qpoint_labels(qpts: np.ndarray,
cell: Optional[Tuple[List[List[float]],
List[List[float]],
List[int]]] = None
) -> List[Tuple[int, str]]:
cell: Optional[tuple[list[list[float]],
list[list[float]],
list[int]]] = None
) -> list[tuple[int, str]]:
"""
Gets q-point labels (e.g. GAMMA, X, L) for the q-points at which the
path through reciprocal space changes direction, or where a point
@@ -170,7 +169,7 @@

def get_reference_data(collection: str = 'Sears1992',
physical_property: str = 'coherent_scattering_length'
) -> Dict[str, Quantity]:
) -> dict[str, Quantity]:
"""
Get physical data as a dict of (possibly-complex) floats from reference
data.
@@ -204,7 +203,7 @@ def get_reference_data(collection: str = 'Sears1992',

Returns
-------
Dict[str, Quantity]
dict[str, Quantity]
Requested data as a dict with string keys and (possibly-complex)
float Quantity values. String or None items of the original data file
will be omitted.
@@ -304,7 +303,7 @@ def convert_fc_phases(force_constants: np.ndarray, atom_r: np.ndarray,
sc_atom_r: np.ndarray, uc_to_sc_atom_idx: np.ndarray,
sc_to_uc_atom_idx: np.ndarray, sc_matrix: np.ndarray,
cell_origins_tol: float = 1e-5
) -> Tuple[np.ndarray, np.ndarray]:
) -> tuple[np.ndarray, np.ndarray]:
"""
Convert from a force constants matrix which uses the atom
coordinates as r in the e^-iq.r phase (Phonopy-like), to a
@@ -390,7 +389,7 @@ def convert_fc_phases(force_constants: np.ndarray, atom_r: np.ndarray,
# atom 0, so the same cell origins can be used for all atoms
cell_origins_map = np.zeros((n_atoms_sc), dtype=np.int32)
# Get origins of adjacent supercells in prim cell frac coords
sc_origins = get_all_origins((2,2,2), min_xyz=(-1,-1,-1))
sc_origins = get_all_origins((2, 2, 2), min_xyz=(-1, -1, -1))
sc_origins_pcell = np.einsum('ij,jk->ik', sc_origins, sc_matrix)
for i in range(n_atoms_sc):
co_idx = np.where(
@@ -429,7 +428,7 @@ def convert_fc_phases(force_constants: np.ndarray, atom_r: np.ndarray,
sc_relative_idx = _get_supercell_relative_idx(cell_origins, sc_matrix)
fc_converted[i, sc_relative_idx[cell_idx]] = fc_tmp

fc_converted = np.reshape(np.transpose(
fc_converted = np.reshape(np.transpose(
fc_converted,
axes=[1, 0, 3, 2, 4]), (n_cells, 3*n_atoms_uc, 3*n_atoms_uc))
return fc_converted, cell_origins
@@ -444,24 +443,23 @@ def _cell_vectors_to_volume(cell_vectors: Quantity) -> Quantity:


def _get_unique_elems_and_idx(
all_elems: Sequence[Tuple[Union[int, str], ...]]
) -> 'OrderedDict[Tuple[Union[int, str], ...], np.ndarray]':
all_elems: Sequence[tuple[int | str, ...]]
) -> dict[tuple[int | str, ...], np.ndarray]:
"""
Returns an ordered dictionary mapping the unique sequences of
elements to their indices
"""
# Abuse OrderedDict to get ordered set
unique_elems = OrderedDict(
zip(all_elems, itertools.cycle([None]))).keys()
return OrderedDict((
# Abuse dict keys to get an "ordered set" of elems for iteration
unique_elems = dict(zip(all_elems, itertools.cycle([None]))).keys()
return dict((
elem,
np.asarray([i for i, other_elem in enumerate(all_elems)
if elem == other_elem])
) for elem in unique_elems)


def _calc_abscissa(reciprocal_cell: Quantity, qpts: np.ndarray
) -> Quantity:
) -> Quantity:
"""
Calculates the distance between q-points (e.g. to use as a plot
x-coordinate)
@@ -519,10 +517,10 @@


def _recip_space_labels(qpts: np.ndarray,
cell: Optional[Tuple[List[List[float]],
List[List[float]],
List[int]]]
) -> Tuple[np.ndarray, np.ndarray]:
cell: Optional[tuple[list[list[float]],
list[list[float]],
list[int]]]
) -> tuple[np.ndarray, np.ndarray]:
"""
Gets q-points point labels (e.g. GAMMA, X, L) for the q-points at
which the path through reciprocal space changes direction or where a
@@ -592,7 +590,7 @@ def _recip_space_labels(qpts: np.ndarray,
return labels, qpts_with_labels


def _generic_qpt_labels() -> Dict[str, Tuple[float, float, float]]:
def _generic_qpt_labels() -> dict[str, tuple[float, float, float]]:
"""
Returns a dictionary relating fractional q-point label strings to
their coordinates e.g. '1/4 1/2 1/4' = [0.25, 0.5, 0.25]. Used for
@@ -612,7 +610,7 @@ def _generic_qpt_labels() -> Dict[str, Tuple[float, float, float]]:


def _get_qpt_label(qpt: np.ndarray,
point_labels: Dict[str, Tuple[float, float, float]]
point_labels: dict[str, tuple[float, float, float]]
) -> str:
"""
Gets a label for a particular q-point, based on the high symmetry
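The typing changes through util.py rely on PEP 585 built-in generics (tuple[int, ...], list[...], dict[...]), available from Python 3.9, and PEP 604 union syntax (bool | np.ndarray), available from Python 3.10, which is why they can only land once 3.8 and 3.9 support is dropped. A minimal before/after sketch (function bodies are placeholders, not Euphonic's real implementations):

```python
import numpy as np

# Old style, required while Python 3.8 was supported
from typing import Dict, Tuple, Union

def is_gamma_old(qpt: np.ndarray) -> Union[bool, np.ndarray]:
    ...  # placeholder body

old_labels: Dict[str, Tuple[float, float, float]] = {'X': (0.5, 0.0, 0.0)}

# New style on Python 3.10+: no typing imports needed for these annotations
def is_gamma_new(qpt: np.ndarray) -> bool | np.ndarray:
    ...  # placeholder body

new_labels: dict[str, tuple[float, float, float]] = {'X': (0.5, 0.0, 0.0)}
```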
16 changes: 8 additions & 8 deletions release_tox.ini
@@ -2,7 +2,7 @@
# Use conda to set up the python environments to run in
requires = tox-conda
# The python environments to run the tests in
envlist = pypi-py38-min,conda-py38-old-np,{pypi,conda}-{py38,py39,py310,py311},pypisource-{py38,py311}
envlist = pypi-py310-min,conda-py310-old-np,{pypi,conda}-{py310,py311,py312},pypisource-{py310,py312}
# Skip the execution of setup.py as we do it with the correct version in commands_pre below
skipsdist = True

@@ -11,7 +11,7 @@ changedir = tests_and_analysis/test
test_command = python run_tests.py --report

# Test PyPI source distribution
[testenv:pypisource-{py38,py311}]
[testenv:pypisource-{py310,py312}]
install_command = python -m pip install {opts} {packages}
deps =
numpy
@@ -24,7 +24,7 @@ commands_pre =
commands = {[testenv]test_command}


[testenv:pypi-{py38,py39,py310,py311}]
[testenv:pypi-{py310,py311,py312}]
install_command = python -m pip install {opts} {packages}
deps =
numpy
@@ -36,10 +36,10 @@ commands_pre =
--only-binary 'euphonic'
commands = {[testenv]test_command}

[testenv:pypi-py38-min]
[testenv:pypi-py310-min]
install_command = python -m pip install --force-reinstall {opts} {packages}
deps =
numpy==1.19.5
numpy==1.21.3
commands_pre =
python -m pip install --force-reinstall \
-r{toxinidir}/tests_and_analysis/minimum_euphonic_requirements.txt
@@ -50,7 +50,7 @@
--only-binary 'euphonic'
commands = {[testenv]test_command}

[testenv:conda-{py38,py39,py310,py311}]
[testenv:conda-{py310,py311,py312}]
whitelist_externals = conda
install_command = conda install {packages}
conda_channels =
@@ -65,7 +65,7 @@ commands = {[testenv]test_command} -m "not brille"

# Test against a version of Numpy less than the latest for Conda
# See https://github.com/conda-forge/euphonic-feedstock/pull/20
[testenv:conda-py38-old-np]
[testenv:conda-py310-old-np]
whitelist_externals = conda
install_command = conda install {packages}
conda_channels =
@@ -74,7 +74,7 @@ conda_channels =
conda_deps =
--file={toxinidir}/tests_and_analysis/tox_requirements.txt
commands_pre =
conda install numpy=1.20
conda install numpy=1.22
conda install -c conda-forge euphonic={env:EUPHONIC_VERSION} matplotlib-base pyyaml h5py
# Brille not available on conda
commands = {[testenv]test_command} -m "not brille"