Fix the signature inconsistency issue in Gramine #1099

Closed
wants to merge 11 commits into from

Conversation


henshy commented Apr 16, 2024

Fix the signature inconsistency issue in Gramine by excluding some TensorFlow .so files from the manifest's trusted files list.
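The description does not include the manifest diff itself; the following is a minimal sketch, assuming the common layout of a Gramine python.manifest.template, of how build-dependent TensorFlow shared objects could be kept out of the measured set. The paths and Jinja variables are illustrative, not taken from this PR.

```toml
# Sketch only: illustrative excerpt of a python.manifest.template.
sgx.trusted_files = [
  "file:{{ entrypoint }}",
  "file:{{ python.stdlib }}/",
  # TensorFlow .so files whose hashes change between builds are intentionally
  # omitted here, so they no longer affect the generated signature.
]

# If those libraries must still be readable inside the enclave, they can be
# listed as allowed (unmeasured) files instead, trading per-file integrity
# checks for a reproducible signature.
sgx.allowed_files = [
  "file:/usr/local/lib/python3.6/dist-packages/tensorflow/",  # illustrative path
]
```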

huangxu17 and others added 11 commits April 8, 2024 20:53
1. Prevention of .pyc File Generation
To ensure code consistency and support hash-signature-based remote attestation for the FedLearner framework and its core dependencies, an environment variable has been introduced in the Gramine configuration that prevents the generation of .pyc files (see the manifest sketch after this list).
2. Gramine Template Configuration for FedLearner
The Gramine template configuration has been updated to include the code locations of the FedLearner framework and several essential dependencies, enabling bidirectional remote attestation between the participating parties (see the trusted-files sketch after this list).
3. Meituan HDFS File Path Management Optimization
The code that handles file paths for Meituan's Hadoop Distributed File System (HDFS) has been moved from the main entry point to the master node; this change prevents the file read and write conflicts that could arise when multiple workers operate simultaneously.
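A minimal sketch of item 1 above: Gramine manifests can pass environment variables into the enclave via loader.env, and Python's standard PYTHONDONTWRITEBYTECODE variable disables .pyc generation. Whether the PR uses exactly this variable is an assumption.

```toml
# Sketch: forward an environment variable that stops CPython from writing
# .pyc files, so no unmeasured bytecode appears inside the enclave at runtime.
loader.env.PYTHONDONTWRITEBYTECODE = "1"
```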
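And a sketch of item 2: adding the framework's code locations to the trusted-files list, so that both parties measure, and can mutually attest, the same FedLearner sources. The directory paths below are hypothetical.

```toml
# Sketch: directories listed in sgx.trusted_files are hashed recursively at
# signing time, so any change to the FedLearner code changes the signature.
sgx.trusted_files = [
  "file:/app/fedlearner/",   # hypothetical FedLearner framework location
  "file:/app/deploy/",       # hypothetical location of core dependencies
]
```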
# Conflicts:
#	sgx/gramine/CI-Examples/generate-token/python.manifest.template
…s that change with each compilation to the Gramine configuration, rather than the entire API folder.
henshy closed this by deleting the head repository on Apr 16, 2024