We want to construct a box whose base length is 5 times the base width. The material used to build the top and bottom costs $3 per square foot, and the material used to build the sides costs $2 per square foot. If the box must have a volume of 100 cubic feet, determine the width that will minimize the cost of building the box. (Note: the dimensions are specifically defined so that which dimension is the width is not ambiguous!)
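One way to check an answer to this kind of problem is a quick numeric sketch. The cost model below is derived from the statement (length l = 5w, height h = 100/(5w²) from the volume constraint, top + bottom at $3/ft², sides at $2/ft²); the names `cost`, `best_w`, and the grid search are illustrative, not part of the problem.

```python
# Cost model assumed from the problem statement:
#   top + bottom: 3 * 2 * (5w * w)        = 30 w^2
#   sides:        2 * (2*w*h + 2*5w*h)    = 24 w h = 480 / w,  with h = 20 / w^2
def cost(w):
    return 30 * w**2 + 480 / w

# Coarse numeric minimization over a plausible range of widths (0.1 ft to 10 ft).
widths = [i / 1000 for i in range(100, 10000)]
best_w = min(widths, key=cost)
print(best_w)  # 2.0 -- matches the calculus answer: C'(w) = 60w - 480/w^2 = 0 gives w = 2
```

Setting the derivative to zero gives w³ = 8, i.e. a width of 2 feet, which the grid search confirms.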

Potpourri-1 Potpourri 1. [2 points] (Answer True/False with justification) Your friend is submitting a MapReduce job. Her input data is split into 100 shards. Ignoring setup time, the actual time for the map function to execute on an input shard is T. To speed up the map phase, she asks for 200 nodes. With this allocation, the map phase on the entire input will complete in T/2 time units.


Failures_and_Recovery_6c Quicksilver The context for this question is the same as the previous one. You are designing a distributed database system on top of the Quicksilver operating system. The system allows clients to perform “read” and “write” operations on data stored across multiple nodes. Each operation generates a transaction tree for recovery purposes. For persistent changes made during a “write” operation, what cleanup protocol would be appropriate? Why?


Internet_Scale_Computing_4 CDN In principle, the Coral system’s key-based routing serves the same fundamental purpose as traditional DHT routing: enabling a source node to locate and communicate with the appropriate destination node. Given this, what specific problem does Coral’s key-based routing address that traditional DHT routing does not? 


Internet_Scale_Computing_1a Giant Scale Services You are deploying a large-scale machine learning model for inference in a cloud data center. The model is 960 GB in size and can be broken down into 8 GB chunks that must be executed in a pipelined manner. Each chunk takes 0.8 ms to process. The available machines each have 8 GB of RAM. You are required to serve 600,000 queries per second. Assume there is perfect compute and communication overlap, and no additional intermediate memory usage during execution. What is the minimum number of machines required to support this throughput? You are free to assume pipelined execution of chunks for this.