We can weigh the benefits against the costs of the necessary equipment, energy, etc. to decide which operations should be executed by terminal computers and which by computers that could play the role of network nodes.
Computer programs were once installed on (terminal) computers; they used their computing power to process data.
It seems that server computers (serving computers) are tasked today with most data processing operations, while client computers (served computers, or terminal computers) do little and store little data. They usually store data temporarily, until servers update the data.
How much does a good computer cost?
One can buy an Intel Core i7-14700K processor for USD 590 and 128 GB of RAM for USD 390: a total of USD 980. An Nvidia GeForce RTX 4070 graphics card can cost USD 800. Some people avoid spending more than USD 1k on a computer. One can buy a good computer at this price. It can have enough processing power to run the programs one needs.
Exchanging messages
3 people create a conversation by creating messages.
a. Person 1 creates message 1 on computer 1.
The program they use can execute the commands of user 1 using the resources of computer 1.
While it can store the data of message 1 on computer 1, it must copy message 1 onto computer 2 and computer 3. Even if it doesn’t always store it on these 2 computers, it must make message 1 available to user 2 and user 3 whenever they ask for it.
I think any human being should be able to set a program to always store such data on whatever computer they choose.
b. Person 2 creates data set 2 on computer 2.
They let the program notify person 1 and person 3.
Irrespective of where it executes the necessary operations, the program must make these notifications available on computer 1 and computer 3.
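The flow above (store a message locally, copy it to the other participants' computers, and leave them a notification) can be sketched as follows. This is a minimal illustration, not a real messaging API; all names (`Peer`, `send_message`) are hypothetical.

```python
# Sketch of the message exchange described above: the sender's program
# stores the message on the sender's own computer, then copies it to each
# recipient's computer when that computer is reachable, and always leaves
# a notification. All names here are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Peer:
    name: str
    online: bool = True
    inbox: list = field(default_factory=list)          # data stored locally
    notifications: list = field(default_factory=list)

def send_message(sender: Peer, text: str, recipients: list):
    """Store the message on the sender's computer, copy it to each
    reachable recipient, and notify every recipient."""
    sender.inbox.append(text)                          # store on computer 1
    for peer in recipients:
        peer.notifications.append(f"new message from {sender.name}")
        if peer.online:                                # copy only if reachable
            peer.inbox.append(text)

p1 = Peer("person 1")
p2 = Peer("person 2")
p3 = Peer("person 3", online=False)
send_message(p1, "message 1", [p2, p3])
# p2 now holds a copy; p3 only has a notification until it comes online
```

When person 3's computer comes back online, the program would deliver the pending copy, which matches the idea that a terminal computer need not serve data 24 hours a day.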
A main activity of the Internet is copying data between computers. It seems that almost all copies are temporary.
Any user can leave their computer on for as long as they like. One could also leave some programs open, so that they can make updates available to remote participants. A program could notify a user when some data become available because a computer has come online, so a terminal computer could serve data for less than 24 hours a day.
Otherwise, any data could be copied to at least one other computer, which would serve the data with hardly any interruptions. It seems one has these options:
a. I operate at least one server.
b. I rent remote computers using a service like Microsoft Azure or MaidSafe.
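Either option amounts to the same pattern: before a terminal computer goes offline, its data is copied to an always-on computer (one I operate, or one I rent), which then serves it. A minimal sketch of that fallback, with hypothetical names (`Node`, `replicate_to`, `fetch`):

```python
# Sketch of the fallback described above: a terminal computer replicates
# its data to an always-on node, so the data stays available even after
# the terminal computer shuts down. All names are hypothetical.
class Node:
    def __init__(self, name, always_on=False):
        self.name = name
        self.always_on = always_on
        self.online = True
        self.store = {}                  # data this computer holds

    def replicate_to(self, other):
        """Copy this node's data onto another node."""
        other.store.update(self.store)

def fetch(key, nodes):
    """Return the data from the first online node that holds it."""
    for node in nodes:
        if node.online and key in node.store:
            return node.store[key]
    return None

laptop = Node("terminal computer")
server = Node("rented server", always_on=True)
laptop.store["message 1"] = "hello"
laptop.replicate_to(server)              # copy before going offline
laptop.online = False                    # owner shuts the laptop down
fetch("message 1", [laptop, server])     # still served, by the replica
```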
One can run any program through servers. Software developers usually choose the servers. Sol can help users choose the servers themselves.
I have the feeling we could task terminal computers with more.
What are your related feelings and thoughts?