
Andreoletti, D.; Rottondi, C.; Giordano, S.; Verticale, G.; Tornatore, M.: An Open Privacy-Preserving and Scalable Protocol for a Network-Neutrality Compliant Caching. In: Proceedings of the 2019 IEEE International Conference on Communications (ICC 2019), Shanghai International Convention Center, Shanghai, China, 2019, pp. 1-6. DOI: 10.1109/ICC.2019.8761596

An Open Privacy-Preserving and Scalable Protocol for a Network-Neutrality Compliant Caching

Rottondi C.;
2019

Abstract

The distribution of video content generated by Content Providers (CPs) significantly contributes to increasing congestion within the networks of Internet Service Providers (ISPs). To alleviate this problem, CPs can use caching to store a portion of their catalogues in servers (i.e., the caches) inside the ISP network and, from there, directly serve content to users. CP caches can significantly improve the overall QoS perceived by users (e.g., the average retrieval latency is reduced) and, for this reason, caching has been regarded as a form of traffic prioritization in recent research on Network Neutrality. Since the storage capacity of a cache is limited, its subdivision among several CPs may therefore lead to discrimination. A possible approach to a neutral division of the cache is to assign each CP the same share of storage. However, this static subdivision does not account for the different popularities of the CPs' contents and is therefore inefficient. A more effective approach is to divide the cache among the CPs proportionally to the popularity of their contents. However, CPs consider this information sensitive and are reluctant to disclose it. In this work, we propose a protocol based on the Shamir Secret Sharing (SSS) scheme that allows the ISP to calculate the portion of cache storage each CP is entitled to receive, guaranteeing network neutrality and resource efficiency without violating the CPs' privacy. The protocol is executed by the ISP, the CPs and a Regulator Authority (RA), which guarantees the actual enforcement of a fair subdivision of the cache storage and the preservation of privacy. We perform extensive simulations and show that our approach leads to higher hit rates (i.e., the percentage of requests served by the cache) than the static subdivision. The advantages are particularly significant when the cache storage is limited.
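
The abstract describes a protocol built on the Shamir Secret Sharing scheme that lets the ISP derive cache quotas proportional to content popularity without the CPs disclosing their popularity data. The snippet below is a minimal Python sketch of the underlying primitive only (splitting a value into shares, exploiting additive homomorphism, and reconstructing just the aggregate, followed by a toy proportional split); the field prime, threshold, number of share holders and the example popularity counts are illustrative assumptions and do not reproduce the authors' actual protocol.

```python
# Minimal sketch (not the paper's protocol): Shamir Secret Sharing (SSS) used to
# aggregate per-CP popularity counts so that only the total is revealed.
# PRIME, T, N and the toy popularity values are assumptions for illustration.

import random

PRIME = 2**61 - 1      # prime defining the finite field (assumption)
T = 3                  # reconstruction threshold: T shares suffice
N = 5                  # number of share holders (e.g., ISP, RA, auxiliary parties)

def make_shares(secret, t=T, n=N):
    """Split `secret` into n Shamir shares with threshold t."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    # Evaluate the random degree-(t-1) polynomial at x = 1..n
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Recover the secret (polynomial value at x = 0) via Lagrange interpolation."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = (num * (-xm)) % PRIME
                den = (den * (xj - xm)) % PRIME
        secret = (secret + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Toy popularity counts (requests observed by each CP) -- purely illustrative.
popularity = {"CP_A": 7000, "CP_B": 2500, "CP_C": 500}

# Each CP shares its count; holders add shares point-wise (additive homomorphism),
# so reconstruction yields only the aggregate, never an individual CP's count.
per_cp_shares = {cp: make_shares(v) for cp, v in popularity.items()}
sum_shares = [(x, sum(per_cp_shares[cp][i][1] for cp in popularity) % PRIME)
              for i, (x, _) in enumerate(per_cp_shares["CP_A"])]

total = reconstruct(sum_shares[:T])
print("aggregate popularity:", total)

# With the aggregate known, a cache of CACHE_SLOTS objects could then be split
# proportionally (how each per-CP fraction is obtained without leaking the raw
# counts is the subject of the full protocol; this line is only a toy split).
CACHE_SLOTS = 1000
for cp, v in popularity.items():
    print(cp, "->", CACHE_SLOTS * v // total, "slots")
```

The sketch only illustrates why SSS fits the setting: shares are additively homomorphic, so the parties can compute shares of an aggregate locally and reconstruct that aggregate without ever seeing an individual CP's input.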
2019
ISBN: 978-1-5386-8088-9
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/11583/2768912