The sgx-hashmachine example hashes a string one billion times inside an enclave and puts the resulting digest in the REPORT DATA field of the quote used for a remote attestation report.
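For concreteness, here is a minimal sketch of that kind of iterated hashing, assuming SHA-256, a placeholder input string, and that each round feeds the previous digest back into the hash (the actual string and hash function used by sgx-hashmachine are not stated here):

```python
import hashlib

# Placeholder input; the actual string used by sgx-hashmachine may differ.
data = b"placeholder input string"

# Iterated hashing: hash the input, then keep re-hashing the digest,
# one billion rounds in total (slow in pure Python, on the order of minutes).
digest = data
for _ in range(1_000_000_000):
    digest = hashlib.sha256(digest).digest()

# The enclave would place this 32-byte digest into the 64-byte REPORT_DATA
# field of the attestation report, so a verifier who trusts the attestation
# does not have to redo the loop above.
print(digest.hex())
```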
Since computing the hash a billion times is a bit tedious (it takes about 10 minutes), this example is well suited to show that, instead of redoing the computation oneself to verify that the final hash is correct given the initial string, one may instead "trust" Intel and the remote attestation process.
This is a toy example meant to demonstrate how one could gain trust in remote computations that are outsourced to cloud services. In other words, if the remote computations are performed in enclaves that are remotely attested, then one can be reasonably confident that the computations were done according to some known source code. But how can one be certain that the deployed and remotely attested enclave was actually built from that source code? That is the purpose of this example: to show how the auditee tool can be used to automate this verification.
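Conceptually, the check that auditee automates amounts to rebuilding the enclave reproducibly from the published source, measuring it to obtain MRENCLAVE, and comparing that value with the MRENCLAVE carried in the IAS-verified attestation report. A rough sketch of the final comparison step, not the auditee API (the function and argument names below are hypothetical):

```python
import hmac

def mrenclave_matches(rebuilt_mrenclave: bytes, attested_mrenclave: bytes) -> bool:
    """Compare the MRENCLAVE of a locally rebuilt enclave (hypothetical input,
    e.g. obtained from a reproducible build plus a measurement step) with the
    MRENCLAVE extracted from the quote embedded in an attestation report.
    """
    # Both values are 32-byte measurements of the enclave's initial state;
    # a constant-time comparison is used here out of habit.
    return len(rebuilt_mrenclave) == 32 and hmac.compare_digest(
        rebuilt_mrenclave, attested_mrenclave
    )
```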
Clarify how to get data to an enclave, for applications that require user input, in such a way that the integrity of the data is preserved and guaranteed. In other words, how can we prevent the untrusted part of the application from changing the data submitted by the user? Is the [sigma protocol](https://en.wikipedia.org/wiki/Proof_of_knowledge#Sigma_protocols), as presented in https://software.intel.com/content/www/us/en/develop/articles/code-sample-intel-software-guard-extensions-remote-attestation-end-to-end-example.html to establish a shared key, the only way? What if a user encrypts data with a public key that is generated in an enclave and published in the REPORT DATA of a remote attestation report? The problem is that the untrusted part of the application also has access to that public key, so it could encrypt something else and substitute it for the user's input, and the enclave code could not tell the difference.
Look at the sgx iot example.
One way to ensure that the input data was not tampered with is to put a hash of it in the REPORT DATA of a remote attestation report.
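A minimal verifier-side sketch of that idea, assuming SHA-256 and assuming the enclave writes the 32-byte digest of the user input into the first half of the 64-byte REPORT DATA field and zero-pads the rest (these layout choices are assumptions, not something sgx-hashmachine prescribes):

```python
import hashlib
import hmac

def input_bound_to_report(user_input: bytes, report_data: bytes) -> bool:
    """Check that the attested REPORT_DATA commits to the user's input.

    Assumes the enclave placed sha256(user_input) in the first 32 bytes of
    the 64-byte REPORT_DATA field and zero-padded the remaining 32 bytes.
    """
    expected = hashlib.sha256(user_input).digest()
    return (
        len(report_data) == 64
        and hmac.compare_digest(report_data[:32], expected)
        and report_data[32:] == bytes(32)
    )
```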