In particular, it is suspected that the algorithms that set these credit limits may be biased against women. The responsible authorities have contacted Goldman Sachs, which manages the Apple Card.
Any discrimination, intentional or not, "violates New York law," New York's Department of Financial Services (DFS) said.
As the Bloomberg news agency reported, technologist David Heinemeier Hansson complained that his Apple Card gave him 20 times the credit limit it gave his wife, even though she had a better credit score.
Later, Apple co-founder Steve Wozniak said the same thing had happened to him and his wife, even though they do not have separate bank accounts or separate assets.
However, this does not seem to be the only such case in the technology world. According to Mr Hansson, creator of the Ruby on Rails web framework, algorithms can discriminate even when no one intends them to.
The US health group UnitedHealth Group is also under investigation over allegations that one of its algorithms favored white patients over black patients.
In a tweet, Mr Hansson described the Apple Card as a "sexist program" and said that as soon as he raised the issue, his wife's credit limit was increased.
The DFS said it "will conduct an investigation to determine whether New York law has been violated and to ensure that all consumers are treated equally regardless of gender."
"Any algorithm that intentionally or unintentionally leads to discrimination against women or any other protected class violates New York law."
On Saturday, the investment bank told Bloomberg: "Our credit decisions are based on a customer's creditworthiness and not on factors such as gender, race, age, sexual orientation, or any other basis prohibited by law."
No one is yet sure what is really going on. However, there is a suspicion that an inadvertent bias has crept into the system.
One possibility is that the algorithms were trained on a dataset in which women historically posed a greater financial risk than men. This could lead the software to offer women lower credit limits in general, even if the underlying assumption does not hold for the population as a whole.
Alternatively, the problem may lie in the data now fed into the algorithms. For example, married men may be more likely to take out large loans in their name alone rather than jointly with their spouses, and the data may not have been adjusted to account for this.
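The mechanism described above can be illustrated with a deliberately simplified sketch. This is a hypothetical toy example, not the actual Apple Card model: a "model" that merely learns the average limit-to-income ratio per group from biased historical records will reproduce that bias for two otherwise identical applicants. All names and figures below are invented for illustration.

```python
# Toy illustration of training-data bias (hypothetical data, not the real system).
from statistics import mean

# Biased historical records: (gender, income, granted_limit).
# Women in this invented history received lower limits at the same income.
history = [
    ("M", 90_000, 20_000), ("M", 90_000, 22_000),
    ("F", 90_000, 9_000),  ("F", 90_000, 11_000),
]

def train(records):
    """Learn an average limit-per-income ratio for each group."""
    model = {}
    for g in {"M", "F"}:
        ratios = [lim / inc for (gg, inc, lim) in records if gg == g]
        model[g] = mean(ratios)
    return model

def predict(model, gender, income):
    """Apply the learned ratio: the historical bias carries straight through."""
    return model[gender] * income

model = train(history)
# Two applicants identical in every respect except gender:
print(round(predict(model, "M", 90_000)))  # 21000
print(round(predict(model, "F", 90_000)))  # 10000
```

The point of the sketch is that nothing in the code mentions an intent to discriminate; the disparity comes entirely from the historical records the model was fitted to, which is why such bias can be inadvertent.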