1. Introduction
Internet-of-Things (IoT) applications, such as financial services, intelligent transportation, and remote sensing, are envisioned to be an important driving force of the smart society [1]. With the popularization of IoT applications, trillions of new devices will be connected to the networks for different application needs, such as smart wearable devices, smart homes, and environmental monitoring [2]. A variety of IoT applications require IoT networks deployed remotely over large areas (e.g., environmental monitoring). However, because infrastructure construction is constrained by the geographical environment, terrestrial wireless systems cannot achieve comprehensive coverage, especially in environments such as deserts, oceans, and forests [3]. As an extension of and supplement to terrestrial communication systems, low earth orbit (LEO), medium earth orbit (MEO), and geostationary earth orbit (GEO) satellites can be used in areas that are not covered by terrestrial networks [4]. For years, industry and academia focused on GEO satellites to provide Internet access, but these high-orbit satellites face significant challenges in providing services. For instance, signals must travel about 36,000 km from Earth to the satellite and back, so delays often exceed 600 ms, which cannot meet many existing service demands [5]. In contrast, LEO satellite systems, orbiting between 160 and 2000 km, can provide low-latency, high-speed Internet connectivity. Thus, LEO has attracted considerable attention from both academia and industry in recent years [6,7,8,9].
To achieve ubiquitous and massive IoT connectivity, access enabled by LEO satellites is a potential solution for supporting large-scale communications of future smart devices [2,10]. The LEO satellite communication (SatCom) system has the characteristics of global coverage (including the polar regions), on-demand access, and large capacity, which can effectively remedy the shortcomings of terrestrial communication systems for IoT communications [11]. One promising use case is that the LEO satellite can act as a base station to monitor disasters in forest or mountainous areas, such as fires, earthquakes, and mudslides, by collecting the sensing information of IoT devices [12,13]. However, there are several challenges to the use of LEO satellites for IoT communications, notably spectrum shortage. On one hand, the growing variety of wireless communication systems leads to an increasing demand for radio spectrum. The spectrum available to satellites has already been allocated for specialized uses such as remote sensing, navigation, and telecommunications, so it is challenging to allocate a dedicated band for LEO satellite-enabled IoT communications. On the other hand, unlike human-to-human communications, IoT transmissions are generally infrequent and exchange only small amounts of data [14,15,16]. In this regard, it would be wasteful to launch a dedicated satellite for IoT communications.
To deal with the above challenges, cognitive radio (CR) is a potential solution [17,18,19]. Specifically, CR allows secondary users (SUs) to use the same spectrum as primary users (PUs): the PUs can use the spectrum resources without any constraint, while the SUs must use the same resources without significantly affecting the PUs [20]. CR can be further categorized into overlay and underlay modes. In the overlay mode, SUs access the spectrum by sensing the PUs' unused bands (spectrum holes) and must immediately release an occupied band when a PU appears. The overlay mode therefore minimizes interference to PUs, but it requires precise spectrum sensing at the SUs, which consumes considerable energy. This is unfriendly to IoT devices, which are usually battery-limited and highly sensitive to energy consumption [21]. In the underlay mode, SU transmissions can coexist with PU transmissions simultaneously, provided that the interference caused to the primary transmission stays below a tolerable threshold. In this way, IoT devices can transmit at any time without waiting for spectrum holes.
By introducing underlay CR into SatCom, IoT users can serve as the SUs and intelligently exploit any available LEO satellite spectrum while avoiding interference with legacy LEO satellite users [22,23]. In the cognitive LEO satellite system, it is important to prevent and minimize interference between the licensed PUs and the SUs. This means that resource allocation and power control must be applied to the secondary system to meet the interference constraints imposed by the primary system [24,25,26,27]. Specifically, spectrum sharing between GEO and LEO SatCom systems is investigated in [28,29] to maximize the sum rate of the combined GEO and LEO systems and the sum rate of the LEO system, respectively. In [30], power control algorithms are proposed to maximize the delay-limited capacity of the secondary system for a cognitive LEO satellite coexisting with terrestrial systems. In [31], spectrum sharing in a cognitive LEO and high-altitude platform (HAP) system is studied for resource allocation under imperfect spectrum sensing. In [32], the LEO satellite serves sensors by sharing the spectrum of the GEO SatCom system. The scenario where the satellite network reuses the spectrum resources of the terrestrial network is considered in [33], and the energy efficiency is maximized by designing a power control scheme. Nevertheless, most of the above works optimize the resources of the secondary system under fixed parameters for the primary transmission. In practice, the performance of the primary and secondary systems is coupled. It is therefore important to design a joint resource allocation scheme that is applicable to both legacy satellite users and cognitive satellite IoT users.
In this paper, we focus on an uplink cognitive LEO SatCom system to support IoT transmissions, in which the IoT communication system serves as the secondary system and shares the spectrum of the legacy LEO SatCom system. In addition, the primary and secondary systems share the same receiver, i.e., the satellite base station. As current LEO SatCom systems widely use code division multiple access (CDMA) technology [34,35,36,37], the spread-spectrum signal is distributed over a wide frequency range, making it difficult to detect the power of the primary users [38]. The collaborative underlay spectrum sharing model is therefore more suitable for the cognitive LEO satellite IoT system than the overlay mode, which requires precise spectrum sensing. With perfect power control, the underlay IoT users can use the same spread spectrum as the legacy satellite users without degrading the PUs' communication quality, which greatly improves spectrum efficiency. In particular, IoT performance can be improved by using CDMA to successfully transmit more packets per unit time [39] and to achieve better spectrum efficiency for IoT transmission than other orthogonal channel allocations [40]. Considering the properties of IoT transmission and for compatibility with the existing LEO system, in this paper, we support cognitive satellite IoT communication in the CDMA manner. We are interested in the achievable rate and resource allocation in cognitive LEO satellite networks. Specifically, the minimum mean square error (MMSE) detector is used to recover the information from the legacy LEO satellite users and the IoT users. Due to the randomness of the spreading codes, it is difficult to allocate the resources directly. Thus, we use random matrix theory to analyze the asymptotic signal-to-interference-plus-noise ratio (SINR) and obtain the achievable rates of both the IoT and legacy systems. Moreover, we jointly optimize the receive powers of the legacy and IoT transmissions by maximizing the sum rate of the IoT users under the condition that the legacy satellite system meets its performance requirement. To solve the formulated problem, we prove that the sum rate of the IoT users is quasi-concave in the legacy satellite user's receive power and, based on this property, derive the optimal receive powers for the two systems. With the designed power allocation scheme, IoT users can transmit information while the performance of the primary system is guaranteed. Finally, extensive simulation results are provided to validate the effectiveness of the designed power allocation scheme.
The rest of this paper is organized as follows. In Section 2, we build up the cognitive LEO satellite communication system model. In Section 3, we derive the asymptotic SINRs of the legacy and IoT systems. In Section 4, the resource allocation problem is formulated and solved. Section 5 presents extensive simulation results that verify our theoretical analysis and validate the effectiveness of the proposed scheme. Finally, the paper is concluded in Section 6.
The notations used in this paper are listed as follows. The lowercase letter x, the boldface lowercase letter x, and the boldface uppercase letter X denote a scalar variable (or constant), a vector, and a matrix, respectively. CN(μ, σ²) denotes the complex Gaussian distribution with mean μ and variance σ². X^T and X^H denote the transpose and conjugate transpose of the matrix X, respectively. x* denotes the optimal value of the variable x. I_N denotes the N-dimensional identity matrix. E[·] denotes statistical expectation. ⊙ denotes the Hadamard (element-wise) product.
2. System Model
For the cognitive LEO satellite system, there are two types of networks: the legacy satellite network and the IoT network. The LEO satellite base station serves the legacy satellite users and the cognitive IoT users simultaneously in the CDMA manner. Meanwhile, the IoT users share the same spectrum as the legacy satellite users, and the two systems share the same LEO satellite receiving antenna. In this setup, the cross-channel state information (C-CSI) between the two systems is easy to obtain. We assume that the satellite and all user terminals are each equipped with a single antenna. In the cognitive LEO satellite system, we consider the uplink case, as shown in Figure 1. Specifically, there are U primary users in the legacy satellite network and K secondary users in the IoT network. Each user in both systems is assigned a specific random spreading code with spreading gain N. In this way, the symbol duration of the IoT users is the same as that of the legacy satellite users.
In what follows, we illustrate the channel model of the cognitive LEO satellite system, then the signal model, followed by the achievable rates of both the primary and secondary systems.
2.1. Channel Model
The satellite channel fading consists of two parts: multipath propagation and shadowing. The shadowed Rice model effectively describes both effects and has been widely applied to channel analysis in various frequency bands, such as the UHF-band, L-band, S-band, and Ka-band [41]. Specifically, the satellite channel fading coefficient between the satellite and the m-th user is given by

h̃_m = A e^{jψ} + Z e^{jϕ},  (1)

where ψ is the stationary random phase and ϕ is the deterministic phase of the line-of-sight (LOS) component. A and Z are independent stationary random processes. Specifically, A is the amplitude of the scattered component, following the Rayleigh distribution, and Z is the amplitude of the LOS component, following the Nakagami distribution, whose probability density function is given by

f_Z(z) = (2 m^m z^{2m−1} / (Γ(m) Ω^m)) exp(−m z²/Ω),  z ≥ 0,  (2)

where m is the Nakagami fading parameter, Ω = E[Z²] is the average power of the LOS component, and Γ(·) is the Gamma function.
Considering the atmospheric effects, the overall satellite channel of the m-th user can be expressed as

h_m = C_m h̃_m,  (3)

where C_m = λ / (4π √(H² + d_m²)) denotes the free-space path loss coefficient with wavelength λ, LEO satellite height H, and distance d_m between the center of the LEO satellite coverage area and the m-th user.
Based on the above channel model, let g_u denote the channel coefficient from the u-th legacy satellite user to the LEO satellite and h_k denote the channel coefficient from the k-th IoT device to the LEO satellite. All of the channels g_u and h_k, for u = 1, …, U and k = 1, …, K, satisfy the expression in (3).
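To make the channel model concrete, the following Python sketch draws shadowed-Rician coefficients of the form (1)–(3). It is a minimal illustration rather than the authors' simulator: the Nakagami parameter m, the LOS power Ω, the scatter power 2b, the wavelength, the orbit height, and the function name are all illustrative assumptions.

```python
import numpy as np

def shadowed_rician_channel(num_users, m=5.0, omega=0.5, b=0.063,
                            wavelength=0.15, height=600e3, dist=0.0,
                            rng=None):
    """Draw h_m = C_m * (A e^{j psi} + Z e^{j phi}) per (1)-(3).

    m: Nakagami fading parameter; omega: average LOS power E[Z^2];
    2b: average scatter power. All parameter values are illustrative.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Rayleigh scatter amplitude A with random phase psi: complex
    # Gaussian with E[|A e^{j psi}|^2] = 2b
    scatter = np.sqrt(b) * (rng.standard_normal(num_users)
                            + 1j * rng.standard_normal(num_users))
    # Nakagami-m LOS amplitude Z: Z^2 ~ Gamma(m, omega/m), so E[Z^2] = omega
    z = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=num_users))
    phi = 2.0 * np.pi * rng.random(num_users)   # LOS phase, fixed per user
    fading = scatter + z * np.exp(1j * phi)
    # Free-space path loss C_m = lambda / (4 pi sqrt(H^2 + d_m^2)), as in (3)
    path_loss = wavelength / (4.0 * np.pi * np.sqrt(height**2 + dist**2))
    return path_loss * fading
```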
2.2. Signal Model
Denote by x_u the data symbol transmitted by legacy satellite user u and by s_k the data symbol transmitted by IoT user k. Symbol x_u is spread by the spreading code c_u = [c_{1,u}, …, c_{N,u}]^T, and symbol s_k is spread by the spreading code d_k = [d_{1,k}, …, d_{N,k}]^T. The spreading codes c_u and d_k satisfy the power constraints ‖c_u‖² = ‖d_k‖² = 1, and the data symbols satisfy E[|x_u|²] ≤ P_u and E[|s_k|²] ≤ Q_k, where P_u and Q_k are the maximum transmit powers of the u-th legacy satellite user and the k-th IoT device, respectively.
The total received signal at the satellite receiver corresponding to the n-th chip can be written as

y_n = Σ_{u=1}^{U} c_{n,u} g_u x_u + Σ_{k=1}^{K} d_{n,k} h_k s_k + w_n,  (4)

for n = 1, …, N, where w_n is the additive white Gaussian noise with w_n ~ CN(0, σ²). For simplicity, the received signal in (4) can be rewritten as

y = C (g ⊙ x) + D (h ⊙ s) + w,  (5)
where C = [c_1, …, c_U] is the N × U matrix formed by the spreading codes of the legacy satellite users, D = [d_1, …, d_K] is the N × K matrix formed by the spreading codes of the IoT users, g = [g_1, …, g_U]^T is the U × 1 vector whose entries are the channel responses of the legacy satellite users, h = [h_1, …, h_K]^T is the K × 1 vector composed of the channel responses of the IoT users, x = [x_1, …, x_U]^T is the U × 1 vector whose entries are the data symbols transmitted by the legacy satellite users, s = [s_1, …, s_K]^T is the K × 1 vector composed of the data symbols transmitted by the IoT users, and w is the N × 1 additive white Gaussian noise vector, i.e., w ~ CN(0, σ² I_N).
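The vector model (5) maps directly to code. The sketch below builds one received snapshot with random binary spreading codes; the channel draws are i.i.d. complex Gaussian placeholders (the shadowed-Rician sketch above could be substituted), and all dimensions are assumed values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, U, K = 128, 32, 64        # spreading gain, legacy users, IoT users (assumed)
sigma2 = 1.0                 # normalized noise power

# Unit-energy random binary spreading codes, columns c_u and d_k
C = rng.choice([-1.0, 1.0], size=(N, U)) / np.sqrt(N)
D = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)

# Placeholder channels and unit-power data symbols
g = (rng.standard_normal(U) + 1j * rng.standard_normal(U)) / np.sqrt(2)
h = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
x = (rng.standard_normal(U) + 1j * rng.standard_normal(U)) / np.sqrt(2)
s = (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)
w = np.sqrt(sigma2 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# y = C (g .* x) + D (h .* s) + w, matching (5)
y = C @ (g * x) + D @ (h * s) + w
```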
2.3. Achievable Rates
Suppose the MMSE detector is adopted by the LEO satellite to recover the information from both the primary and the secondary transmissions. For the primary transmission, let m_u be the MMSE vector used to detect the u-th element of x, so that the detector output is x̂_u = m_u^H y. Let q_u and p_k denote the receive powers of the u-th legacy satellite user and the k-th IoT user, respectively. Thus, the SINR of legacy satellite user u can be calculated as

SINR_u^p = q_u |m_u^H c_u|² / ( Σ_{i=1, i≠u}^{U} q_i |m_u^H c_i|² + Σ_{k=1}^{K} p_k |m_u^H d_k|² + σ² ‖m_u‖² ).  (6)

The MMSE vector m_u is designed by minimizing the mean-square error (MSE) between the processed signal and the transmitted symbol. Define the MSE function for legacy satellite user u as

MSE_u(m_u) = E[ |m_u^H y − x_u|² ].  (7)

Then the MMSE vector m_u, which minimizes the MSE function MSE_u(m_u), is represented as

m_u = √q_u ( Σ_{i=1}^{U} q_i c_i c_i^H + Σ_{k=1}^{K} p_k d_k d_k^H + σ² I_N )^{−1} c_u.  (8)

Based on the analysis in [42], the SINR of legacy satellite user u can be evaluated as

SINR_u^p = q_u c_u^H ( Σ_{i=1, i≠u}^{U} q_i c_i c_i^H + Σ_{k=1}^{K} p_k d_k d_k^H + σ² I_N )^{−1} c_u.  (9)

Accordingly, the achievable rate of the u-th legacy satellite user can be written as

R_u^p = log₂(1 + SINR_u^p).  (10)

Similarly, we use the MMSE detector to recover the signals transmitted by the IoT users. Based on the above analysis, the SINR of the k-th IoT user is given by

SINR_k^s = p_k d_k^H ( Σ_{u=1}^{U} q_u c_u c_u^H + Σ_{i=1, i≠k}^{K} p_i d_i d_i^H + σ² I_N )^{−1} d_k.  (11)

Then, the achievable rate of the k-th IoT user can be written as

R_k^s = log₂(1 + SINR_k^s).  (12)
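For a numerical check of (9), the following sketch evaluates the MMSE output SINR of one legacy user directly from the covariance of its interference plus noise. The function name and variable layout are ours; the expression is the standard MMSE SINR, shown here as a hedged illustration rather than code from this paper.

```python
import numpy as np

def mmse_sinr_legacy(C, D, q, p, sigma2, u):
    """SINR of legacy user u per (9): q_u * c_u^H R_u^{-1} c_u, where R_u is
    the covariance of all interference (other legacy users plus IoT users)
    and noise. Arrays q and p hold the per-user receive powers."""
    N = C.shape[0]
    R = sigma2 * np.eye(N, dtype=complex)
    for i in range(C.shape[1]):
        if i != u:                                   # exclude the desired user
            R += q[i] * np.outer(C[:, i], C[:, i].conj())
    for k in range(D.shape[1]):
        R += p[k] * np.outer(D[:, k], D[:, k].conj())
    c_u = C[:, u]
    return float((q[u] * c_u.conj() @ np.linalg.solve(R, c_u)).real)
```

Swapping the roles of C and D (and of q and p) gives the IoT SINR in (11).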
From (9) and (11), we can find that the legacy satellite user's SINR depends on the spreading codes of the primary system as well as those of the secondary system. Because of the randomness of the spreading codes, it is difficult to exactly calculate the SINRs and to allocate resources for the two systems. Thus, in Section 3, we analyze the asymptotic SINRs of both the primary and secondary transmissions.
3. Asymptotic Analysis
In this section, we analyze the asymptotic SINRs for both primary and secondary systems in order to allocate the resource for the two systems.
We consider a large cognitive LEO SatCom system, in which the numbers of users are large, i.e., U → ∞ and K → ∞. To support a large number of users, it is reasonable to scale up N as well, i.e., N → ∞, such that U/N converges to a constant α, which represents the legacy satellite system load. Similarly, K/N converges to a constant β, which represents the IoT system load.
To analyze the asymptotic SINR, we first present the following proposition.
Proposition 1 (Theorem 3.1 of [42]). For a symbol-synchronous multi-access spread-spectrum system with spreading gain N, the SINR of the first user in an M-user system is deterministic in the large-system limit and approximately satisfies

SINR_1 = P_1 / ( σ² + (1/N) Σ_{i=2}^{M} I(P_i, P_1, SINR_1) ),  (13)

where

I(P_i, P_1, γ) = P_i P_1 / (P_1 + P_i γ)  (14)

is the effective interference and P_i denotes the received power of user i.

From Proposition 1, the asymptotic SINR is determined by the received power of each user. Based on (13) and (14), the asymptotic SINR γ_u^p of the u-th legacy satellite user satisfies

γ_u^p = q_u / ( σ² + (1/N) Σ_{i=1, i≠u}^{U} I(q_i, q_u, γ_u^p) + (1/N) Σ_{k=1}^{K} I(p_k, q_u, γ_u^p) ).  (15)

As both the primary and the secondary systems are based on CDMA, the method for analyzing the asymptotic SINR of the IoT users is the same as that for the legacy satellite system. Thus, the asymptotic SINR γ_k^s of the k-th IoT user is given by

γ_k^s = p_k / ( σ² + (1/N) Σ_{u=1}^{U} I(q_u, p_k, γ_k^s) + (1/N) Σ_{i=1, i≠k}^{K} I(p_i, p_k, γ_k^s) ).  (16)
From (15) and (16), we find that the asymptotic SINRs of both the primary and the secondary systems depend on the received powers of all links. To simplify the resource allocation, we assume that the received powers of all legacy satellite users are identical and equal to q, which can be achieved by perfect power control. Similarly, the received power of every IoT user equals p. Then, (15) can be simplified as

γ_p = q / ( σ² + ((U−1)/N) q/(1+γ_p) + (K/N) pq/(q+pγ_p) ),  (17)

which, in the large-system limit, gives

γ_p = q / ( σ² + α q/(1+γ_p) + β pq/(q+pγ_p) ).  (18)

Similarly, (16) can be simplified as

γ_s = p / ( σ² + α qp/(p+qγ_s) + β p/(1+γ_s) ).  (19)
4. Joint Resource Allocation in the Cognitive LEO SatCom System
In this section, we formulate and solve resource allocation problems to maximize the sum rate of all IoT users under certain constraints. First, we investigate the optimal IoT receive power and then the optimal joint legacy satellite user and IoT user receive powers. Next, based on the optimal powers, we exploit the optimal number of IoT users. Finally, we discuss the effect of non-synchronous transmission between the primary and secondary systems.
4.1. Resource Allocation
To protect the legacy satellite service, we have to guarantee that its SINR is no less than the target value γ_th, i.e., γ_p ≥ γ_th, and that the IoT receive power is no more than the limit value p_max, i.e., p ≤ p_max. Based on the above analysis, our first resource allocation problem in the CR system tries to maximize the sum rate of the IoT system, which can be formulated as

P1:  max_p  K log₂(1 + γ_s)
     s.t.  γ_p ≥ γ_th,
           0 ≤ p ≤ p_max.
From (19), it is difficult to get a closed-form expression for γ_s. Similarly, a closed-form expression for γ_p is also difficult to obtain. To overcome this problem, we provide the following lemma.
Lemma 1. For any p₁, p₂, and fixed q, if p₁ > p₂, then γ_s(p₁, q) > γ_s(p₂, q); that is, the asymptotic IoT SINR, and hence the IoT sum rate, is monotonically increasing in p.
From Lemma 1, we can find that the objective function increases with the growth of p. However, an increase in p decreases the asymptotic SINR of the legacy satellite users: similar to Lemma 1, if p₁ > p₂, then γ_p(p₁, q) < γ_p(p₂, q). Thus, when q is small, the SINR requirement constraint γ_p ≥ γ_th is dominant while the power limit is not active. Ignoring the power constraint, we can get P1.1:

P1.1:  max_p  K log₂(1 + γ_s)
       s.t.  γ_p ≥ γ_th.
For the optimization problem P1.1, the sum rate of the IoT users is maximized when p is as large as possible, while γ_p decreases as p grows. Thus, the objective function is maximized when the SINR constraint holds with equality, i.e., γ_p = γ_th. Substituting γ_p = γ_th into (18) and solving for p, we can derive the optimal IoT receive power as

p* = Φ(q) q / ( β q − Φ(q) γ_th ),  with  Φ(q) = q/γ_th − σ² − α q/(1+γ_th).  (21)
When q is large enough, p reaches the power limit before γ_p drops to γ_th. Ignoring the SINR constraint, we can get P1.2:

P1.2:  max_p  K log₂(1 + γ_s)
       s.t.  0 ≤ p ≤ p_max.

Due to the fact that γ_s increases with p, the optimal receive power of the IoT users is

p* = p_max.  (23)
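Combining (21) and (23), the optimal IoT receive power for a given q is the SINR-limited value clipped to p_max. A minimal sketch under the same notation, with Φ as defined in (21) and the branch conditions being our reading of the two regimes:

```python
def optimal_iot_power(q, gamma_th, p_max, alpha, beta, sigma2):
    """Optimal IoT receive power for a given q, per (21) and (23) (sketch)."""
    # Phi(q): interference margin left once noise and intra-legacy
    # interference at the target SINR are accounted for, cf. (21)
    phi = q / gamma_th - sigma2 - alpha * q / (1.0 + gamma_th)
    if phi <= 0.0:
        return 0.0            # q <= q_min: legacy system tolerates no IoT power
    denom = beta * q - phi * gamma_th
    if denom <= 0.0:
        return p_max          # SINR constraint never binds: power-limited (23)
    return min(p_max, phi * q / denom)   # SINR-limited value (21), clipped
```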
4.2. Joint Resource Allocation
From (21), we know that the optimal receive power of the IoT users depends on the received power of the legacy satellite system. With perfect power control, we can adjust the receive power q of the legacy satellite users to maximize the sum rate of the IoT users. Thus, we formulate the joint resource allocation problem as P2:

P2:  max_{p,q}  K log₂(1 + γ_s)
     s.t.  γ_p ≥ γ_th,
           0 ≤ p ≤ p_max.
For every given q, there is an optimal IoT receive power p that maximizes the IoT system's sum rate. To solve the joint resource allocation problem, we examine the influence of q on the sum rate under the optimal IoT receive power p, based on which we provide the following analysis.
Lemma 2. When q_min < q ≤ q_max, with the optimal IoT receive power p given by (21), if q₁ > q₂, then R_s(q₁) > R_s(q₂).

Lemma 3. When q > q_max, with the optimal IoT receive power p = p_max, if q₁ > q₂, then R_s(q₁) < R_s(q₂).

Note that the values of q_min and q_max will be discussed later. From Lemmas 2 and 3, we know that when q_min < q ≤ q_max, the IoT sum rate increases with q, and when q > q_max, it decreases with q. This indicates that the sum rate of the IoT system is quasi-concave in the legacy satellite receive power. As a result, when q = q_max, i.e., q* = q_max, the sum rate of the IoT system is maximized. Meanwhile, if q ≤ q_min, the IoT users cannot transmit signals, i.e., p* = 0.
Next, we focus on the values of q_min and q_max. When the legacy satellite system cannot tolerate any interference from the IoT system, the receive power of the legacy satellite system is at its minimum, i.e.,

γ_th = q_min / ( σ² + α q_min/(1+γ_th) ).  (24)

Solving this equation for q_min, we get

q_min = γ_th σ² (1+γ_th) / ( 1 + γ_th − α γ_th ).  (25)

When q = q_max, (21) and (23) both hold, which gives

p_max = Φ(q_max) q_max / ( β q_max − Φ(q_max) γ_th ).  (26)

By solving (26), we can get the value of q_max.
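Since the IoT sum rate is quasi-concave in q (Lemmas 2 and 3), q* can also be located numerically by a golden-section search between q_min from (25) and an upper bound. The sketch below reuses asymptotic_sinr and optimal_iot_power from the earlier sketches; the search bounds and tolerance are assumptions.

```python
import numpy as np

def iot_sum_rate(q, gamma_th, p_max, K, alpha, beta, sigma2):
    """IoT sum rate K*log2(1+gamma_s) at the optimal p for this q."""
    p = optimal_iot_power(q, gamma_th, p_max, alpha, beta, sigma2)
    if p <= 0.0:
        return 0.0                       # q <= q_min: IoT users stay silent
    gamma_s = asymptotic_sinr(p, q, p, alpha, beta, sigma2)
    return K * np.log2(1.0 + gamma_s)

def best_legacy_power(q_lo, q_hi, args, tol=1e-4):
    """Golden-section search for q*, valid under quasi-concavity."""
    ratio = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = q_lo, q_hi
    while b - a > tol:
        c, d = b - ratio * (b - a), a + ratio * (b - a)
        if iot_sum_rate(c, *args) < iot_sum_rate(d, *args):
            a = c
        else:
            b = d
    return 0.5 * (a + b)
```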
As aforementioned, the optimal joint resource allocation scheme is related to the primary and secondary system loads, i.e., α and β. Intuitively, growing the number of IoT users contributes new increments to the sum rate but also increases the interference. Thus, there exists a tradeoff in the number of IoT users. In the following, we exploit the optimal user number of the IoT system to maximize its sum rate for a given N.
4.3. Optimal IoT User Number
Here, we illustrate the optimal IoT user number that maximizes the sum rate of the IoT transmissions, subject to the receive power constraint and the primary SINR requirement constraint, which can be mathematically formulated as P3:

P3:  max_{K,p,q}  K log₂(1 + γ_s)
     s.t.  γ_p ≥ γ_th,
           0 ≤ p ≤ p_max,
           K ∈ ℤ₊.
For every given K, there exists an optimal pair (p, q) that maximizes the sum rate of the IoT system. Given p and q, the optimization of K involves the log function and the non-closed-form γ_s, so it is difficult to obtain the optimal K in closed form. In fact, when K is small, the sum rate of the IoT users increases with the growth of K. When K is large, the sum rate decreases with K, since the shrinking SINR of the IoT users then dominates the sum rate. Based on this fact, we can apply a one-dimensional search to find the optimal tuple (K*, p*, q*) that solves problem P3. Specifically, for each K, we calculate the optimal p* and q*. By searching over K, we obtain multiple tuples (K, p*, q*), and by comparing their IoT sum rates, we obtain the optimal tuple (K*, p*, q*). The details of the proposed algorithm for solving P3 are summarized in Algorithm 1. Specifically, we first initialize K and the step size ΔK, where the initial K is a small number and ΔK is no less than one. Then, we iteratively update p, q, and K until the objective function starts to decrease.
Algorithm 1. Solution to P3.
1: Initialize K and set the step size ΔK ≥ 1;
2: Repeat
3:   Calculate q* based on (26) and p* based on (21) and (23), given K;
4:   K ← K + ΔK;
5: Until the objective function of P3 starts to decrease;
6: Obtain the optimal tuple (K*, p*, q*).
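A direct transcription of Algorithm 1, reusing iot_sum_rate and best_legacy_power from the previous sketch. The stopping rule and step size follow the description above, while the function name and the search bounds for q are our assumptions.

```python
def best_iot_user_count(K0, N, gamma_th, p_max, q_lo, q_hi, alpha, sigma2,
                        dK=1):
    """One-dimensional search over K (Algorithm 1): grow K by dK while the
    optimal IoT sum rate keeps improving; return the best (K, q*, rate)."""
    best, rate_prev, K = None, -1.0, K0
    while True:
        beta = K / N                    # IoT system load for this K
        args = (gamma_th, p_max, K, alpha, beta, sigma2)
        q_star = best_legacy_power(q_lo, q_hi, args)
        rate = iot_sum_rate(q_star, *args)
        if rate < rate_prev:            # objective starts to decrease: stop
            break
        best, rate_prev = (K, q_star, rate), rate
        K += dK
    return best
```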
4.4. Non-Synchronous Uplink
In the above, the entire analysis assumed a synchronous uplink cognitive LEO satellite spectrum sharing system. In this section, we extend it to the non-synchronous case.
For the non-synchronous uplink cognitive LEO SatCom system, the total received signal at the satellite base station can be written as

y = C (g ⊙ x) + D_L (h ⊙ s) + w,  (28)

where D_L is the N × K matrix formed by the spreading codes of the IoT users shifted by L chips, s is the K × 1 vector composed of the data symbols the IoT users transmit, and L is the synchronization error between the primary and the secondary systems. Based on the discussions in Section 2, we can obtain the MMSE output for the non-synchronous uplink cognitive LEO SatCom system, which is given by

m_u = √q_u ( Σ_{i=1}^{U} q_i c_i c_i^H + Σ_{k=1}^{K} p_k d_{L,k} d_{L,k}^H + σ² I_N )^{−1} c_u,  (29)

where d_{L,k} denotes the k-th column of D_L. From (29), we can find that although the primary and secondary systems are not perfectly synchronous, the MMSE output of the non-synchronous uplink system has the same form as that of the synchronous uplink system, with the shifted spreading codes in place of the original ones. Thus, the resource allocation analysis and scheme for the non-synchronous uplink system parallel those of the synchronous case and are thereby omitted.
5. Simulation Results
In this section, simulation results are presented to evaluate the performance of the proposed cognitive LEO satellite communication system. The spreading gain N for the legacy and IoT systems is set large enough to verify the asymptotic results obtained in this paper. Although the LEO satellite system involves the shadowed Rice channel, the resource allocation schemes in this paper operate on the receive signal-to-noise ratio (SNR). Since the performance of the legacy and IoT systems is determined by the SINR, we use the receive SNRs, i.e., q/σ² and p/σ², to evaluate the performance of the proposed cognitive LEO satellite communication system. Specifically, the white Gaussian noise power is normalized, i.e., σ² = 1. We set the target SINR threshold for the legacy satellite system to γ_th = 5 dB. To show the effectiveness of our proposed framework and algorithms, we use two benchmarks, illustrated as follows:
Benchmark 1: To show the advantages of spectrum sharing, we show the performance of the legacy satellite system without spectrum sharing, whose SINR with the MMSE detector is given by γ_NS = q / (σ² + α q/(1+γ_NS)). Accordingly, the sum rate of the legacy satellite users is given by R_NS = U log₂(1 + γ_NS).
Benchmark 2: To show the advantages of the joint resource allocation scheme, we show the performance obtained by optimizing only the receive power of the IoT devices, based on the analysis in Section 4.1.
Firstly, we evaluate the asymptotic SINR of the legacy satellite system by comparing the simulated SINR with the theoretical SINR under the MMSE detector, for fixed system loads α and β and a fixed legacy receive SNR q/σ². The simulated SINR is calculated from the random spreading codes assigned to each user in the legacy satellite system, as shown in (9). The theoretical SINR with the MMSE detector is calculated by (18). As shown in Figure 2, the dense small circles are the simulated SINRs, while the big circles are the theoretical SINRs for the MMSE detector. It is seen that the theoretical SINR can be considered as the statistical mean of the simulated SINRs. Thus, it is reasonable to formulate the joint resource allocation problem based on the asymptotic SINR. In addition, we can find that the interference from the IoT users and the other legacy satellite users leads to about 4 dB of SINR loss with the MMSE detector compared with the receive SNR of the legacy satellite user.
Figure 3 and Figure 4 present the sum rate of the IoT users and the sum rate of the legacy satellite users, respectively, with respect to the legacy satellite receive SNR q/σ², for several IoT power limits p_max/σ² and fixed system loads. From Figure 3, it can be found that the sum rate of the IoT users is quasi-concave in the legacy satellite receive SNR, which is consistent with the results in Lemmas 2 and 3. In addition, from Figure 3 and Figure 4, we can find that when the legacy receive SNR is below a threshold, the sum rate of the IoT users equals 0, while the sum rate of the legacy satellite users still increases with q/σ². The main reason is that below this threshold we have q ≤ q_min, which means that the legacy satellite system cannot tolerate any interference from the IoT system. Above the threshold, the sum rate of the IoT users first increases with q/σ². In this regime, the sum rate of the legacy satellite users remains unchanged, which indicates that the legacy SINR requirement constraint dominates the power allocation scheme. When the IoT receive power limit is not reached, the curves for different p_max/σ² coincide, and all of the interference margin can be exploited by the IoT system. Note that the interference margin refers to the tolerable interference of the legacy satellite system. For larger q/σ², the receive power limit dominates the power allocation scheme. Note that the turning points in the two figures are related to the parameter design; with other parameters, the turning points may change.
Figure 5 presents the sum rates of the IoT and the legacy satellite users with respect to the legacy satellite receive SNR q/σ² for different legacy user numbers and a fixed IoT power limit. As aforementioned, Benchmark 1 indicates the performance of the legacy satellite users without spectrum sharing. From Figure 5, it is obvious that the proposed spectrum sharing scheme achieves a higher sum rate, which indicates its effectiveness. Meanwhile, from this figure, we can find that the spectrum sharing scheme has greater performance gains when the number of legacy satellite users is small. The main reason is that the available interference margin increases as the legacy user number decreases. In addition, when q ≤ q_min, the two schemes have the same performance, since the legacy satellite system cannot tolerate any interference from the IoT system and the sum rate of the IoT users equals 0.
Next, we evaluate the optimal legacy satellite receive power and the optimal sum rate of the IoT users under the optimized joint power allocation scheme by varying the maximum receive SNR of the IoT users p_max/σ², as shown in Figure 6 and Figure 7. Meanwhile, to show the effectiveness of the joint resource allocation scheme, we also plot the curves of Benchmark 2 in Figure 7 for a fixed legacy receive SNR. In the two figures, several legacy system loads α are considered. From Figure 6, we can find that as p_max/σ² grows, the optimal receive SNR of the legacy users, i.e., q*/σ², also increases. This indicates that the sum rate of the IoT users is maximized at a higher q*, which is also shown in Figure 3. Meanwhile, the optimal receive SNR increases as α decreases, due to less interference from the legacy satellite users. In Figure 7, we can observe that, for any α, the optimal IoT sum rate increases with p_max/σ², since a higher p_max helps the IoT users exploit more of the interference margin. Furthermore, our proposed joint resource allocation scheme performs better than Benchmark 2 for any legacy system load, which shows the effectiveness of the joint design. In addition, the optimal IoT sum rate decreases with α, since the available interference margin shrinks as α grows.
Finally, the optimal sum rate of the IoT users with respect to the number of IoT users is shown in Figure 8 for several IoT power limits. For every given K, the sum rate of the IoT users is calculated with the optimal p and q that maximize the IoT sum rate. It is observed that the optimal sum rate of the IoT users is quasi-concave in the number of IoT users: the sum rate equals the user number times each user's rate, and increasing the user number decreases the SINR in (19), which in turn decreases each user's throughput. Since the user number must be an integer, we can search K near the peak in Figure 8 to confirm the IoT user number that maximizes the IoT sum rate. For example, from Figure 8, we can observe that the optimal secondary user number is around 180 for the considered setup. Meanwhile, we can find that in this setup, the receive SNR p_max/σ² has a trivial effect on the optimal number of IoT users, due to the dominant effect of the QoS requirements of the legacy satellite users.